⏰ Understanding Unix Timestamps: A Developer's Guide

If you've worked with databases, APIs, or any system that tracks time, you've encountered Unix timestamps. This simple yet powerful concept is fundamental to how computers handle time. Let's demystify it.

What is a Unix Timestamp?

A Unix timestamp (also called "epoch time" or "POSIX time") is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC. This moment is called the "Unix epoch."

Example: The timestamp 1705276800 represents January 15, 2024, 00:00:00 UTC. It's simply the count of seconds since the epoch.
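
If you want to verify this yourself, one quick check in a JavaScript console (multiplying by 1000 because JavaScript dates count milliseconds):

new Date(1705276800 * 1000).toISOString() // "2024-01-15T00:00:00.000Z"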

Why 1970? When Unix was being developed in the early 1970s, its developers needed a fixed reference point, and January 1, 1970 was a convenient, round, recent date to count from.

Why Developers Love Timestamps

Timestamps are timezone-agnostic (always UTC), compact to store as a single integer, trivially sortable and comparable, and date arithmetic reduces to plain addition and subtraction. Virtually every language, database, and API can produce and parse them.

Working with Timestamps

JavaScript

// Current timestamp (in seconds)
const timestamp = Math.floor(Date.now() / 1000);

// Timestamp to Date
const date = new Date(timestamp * 1000);

// Date to timestamp (date-only ISO strings like this parse as UTC midnight)
const ts = Math.floor(new Date('2024-01-15').getTime() / 1000);

⚠️ JavaScript Gotcha

JavaScript's Date.now() returns milliseconds, not seconds. Always divide by 1000 for Unix timestamps.

Python

import time
from datetime import datetime, timezone

# Current timestamp
timestamp = int(time.time())

# Timestamp to datetime (tz=timezone.utc gives UTC; omit it for local time)
dt = datetime.fromtimestamp(timestamp, tz=timezone.utc)

# Datetime to timestamp (attach tzinfo; naive datetimes are treated as local time)
ts = int(datetime(2024, 1, 15, tzinfo=timezone.utc).timestamp())

PHP

// Current timestamp
$timestamp = time();

// Timestamp to date
$date = date('Y-m-d H:i:s', $timestamp);

// Date to timestamp (interpreted in PHP's configured default timezone)
$ts = strtotime('2024-01-15');

The Year 2038 Problem

⚠️ Critical Issue: On January 19, 2038, at 03:14:07 UTC, 32-bit Unix timestamps will overflow. Systems still using 32-bit signed integers for time will see dates wrap around to 1901 or crash entirely.

A signed 32-bit integer maxes out at 2,147,483,647. Interpreted as seconds after the epoch, that value lands at 03:14:07 UTC on January 19, 2038; one second later, the counter overflows.
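
You can check the boundary yourself in JavaScript, where numbers are 64-bit floats and don't overflow at this size:

// The last instant representable by a signed 32-bit second counter
new Date(2147483647 * 1000).toISOString() // "2038-01-19T03:14:07.000Z"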

Modern systems address this by using 64-bit timestamps, which won't overflow for about 292 billion years. Most programming languages and databases have already transitioned.

Common Pitfalls

1. Timezone Confusion

Timestamps are always UTC. The pitfalls appear at display time, so be explicit about the timezone you want to show:

// Risky: default formatting uses whatever timezone the machine runs in
new Date(1705276800000).toString() // output varies by machine

// Better: format explicitly for the target timezone
new Date(1705276800000).toLocaleString('en-US', {timeZone: 'America/New_York'})

2. Milliseconds vs Seconds

JavaScript uses milliseconds; Unix traditionally uses seconds. Always check which unit your system expects.
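
One defensive pattern is to normalize values on input. Here's a minimal sketch (toSeconds is a made-up helper name, and the 1e12 cutoff assumes present-era dates: 1e12 seconds is tens of thousands of years after the epoch, while 1e12 milliseconds is only September 2001):

// Treat anything above 1e12 as milliseconds and convert down
function toSeconds(ts) {
  return ts > 1e12 ? Math.floor(ts / 1000) : ts;
}

toSeconds(1705276800000) // 1705276800 (was milliseconds)
toSeconds(1705276800)    // 1705276800 (already seconds)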

3. Negative Timestamps

Dates before 1970 have negative timestamps. December 31, 1969 at 23:59:59 UTC is timestamp -1. Some systems don't handle negative timestamps correctly.
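
JavaScript handles negative values fine, which makes it handy for checking what a system under test should return:

new Date(-1 * 1000).toISOString() // "1969-12-31T23:59:59.000Z"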

4. Leap Seconds

Unix time doesn't account for leap seconds: it assumes every day is exactly 86,400 seconds. For most applications this doesn't matter, but if you need true second-level precision across leap seconds, look at TAI or a leap-smearing time source rather than raw Unix time.

Useful Timestamp Values

Seconds in a minute: 60
Seconds in an hour: 3,600
Seconds in a day: 86,400
Seconds in a week: 604,800
Seconds in a year: 31,536,000 (approximately)

// Quick math (all values in seconds)
const now = Math.floor(Date.now() / 1000);
const tomorrow = now + 86400;
const nextWeek = now + 604800;
const yesterday = now - 86400;

When to Use Timestamps vs Date Strings

Use timestamps for:

Storing and indexing time in databases
Comparing, sorting, and doing date arithmetic
Passing instants between systems and languages

Use date strings (ISO 8601) for:

Human-readable output: logs, UIs, debugging
Future events where local timezone rules matter
Config files and documents people edit by hand
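
Converting between the two forms is cheap. In JavaScript, for instance:

// Timestamp to ISO 8601 string
new Date(1705276800 * 1000).toISOString() // "2024-01-15T00:00:00.000Z"

// ISO 8601 string to timestamp (Date.parse returns milliseconds)
Math.floor(Date.parse('2024-01-15T00:00:00Z') / 1000) // 1705276800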

🔧 Need to Convert Timestamps?

Use our free Unix Timestamp Converter to convert between timestamps and human-readable dates.

Open Timestamp Converter →

Conclusion

Unix timestamps are beautifully simple: count seconds from a fixed point. This simplicity makes them perfect for storing, comparing, and transmitting time data. Just remember the key conversions, watch out for the millisecond/second distinction, and always think in UTC.

With timestamps in your toolkit, handling time in any application becomes straightforward.