Unix Timestamps Explained: A Developer's Guide

Unix timestamps are one of the most fundamental concepts in software development. Every time you work with dates in a database, API, or log file, you are likely dealing with Unix time under the hood. Understanding how it works will save you from subtle bugs and make you more effective when debugging time-related issues.

What is a Unix Timestamp?

A Unix timestamp (also called epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC. This moment is known as the Unix epoch. For example, the timestamp 1700000000 corresponds to November 14, 2023, 22:13:20 UTC.

The beauty of Unix timestamps is their simplicity. A single integer represents an exact moment in time, regardless of time zones, daylight saving time, or calendar systems. This makes them ideal for storing, comparing, and transmitting dates in software.
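The time-zone independence described above is easy to see in code. This sketch uses Python's standard `datetime` module to render the example timestamp from this article in two different zones (the +09:00 offset here is just an illustration of Japan Standard Time):

```python
from datetime import datetime, timezone, timedelta

ts = 1_700_000_000  # the example timestamp from the text

# The same integer denotes the same instant in every time zone;
# only the rendered wall-clock time differs.
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
tokyo = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=9)))

print(utc.isoformat())    # 2023-11-14T22:13:20+00:00
print(tokyo.isoformat())  # 2023-11-15T07:13:20+09:00
assert utc == tokyo       # equal instants, despite different local renderings
```

Aware `datetime` objects compare by instant, which is why the final assertion holds even though the printed strings differ.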

Why January 1, 1970?

The choice of epoch was somewhat arbitrary. When Ken Thompson and Dennis Ritchie were developing Unix at Bell Labs in the early 1970s, they needed a fixed reference point for the system clock, and January 1, 1970, was a convenient round, recent date. The original implementation used a 32-bit signed integer, which can represent dates from December 13, 1901 to January 19, 2038.

The Year 2038 Problem

A 32-bit signed integer can store values up to 2,147,483,647. This corresponds to January 19, 2038, 03:14:07 UTC. After this moment, 32-bit timestamps will overflow and wrap around to negative values, representing dates in 1901. This is known as the Y2K38 problem.

Most modern systems have already migrated to 64-bit timestamps, which can represent dates billions of years into the future. JavaScript's Date object counts milliseconds and stores them as a double-precision (64-bit) floating-point number, so it is not affected. However, if you work with embedded systems, legacy databases, or 32-bit architectures, the 2038 problem is something to be aware of.
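The wraparound is easy to simulate. This sketch uses `ctypes.c_int32` to mimic a signed 32-bit `time_t` overflowing one second past its maximum:

```python
import ctypes
from datetime import datetime, timezone

T_MAX_32 = 2_147_483_647  # largest value a signed 32-bit integer can hold

# The last representable moment for a 32-bit time_t:
print(datetime.fromtimestamp(T_MAX_32, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, the value wraps to the most negative 32-bit integer...
wrapped = ctypes.c_int32(T_MAX_32 + 1).value
print(wrapped)  # -2147483648

# ...which a 32-bit system would interpret as a date in 1901.
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```

Note that passing an explicit `tz` makes `fromtimestamp` handle negative values portably; without it, some platforms reject pre-epoch timestamps.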

Seconds vs Milliseconds

Different systems use different precisions for timestamps:

  • Seconds (10 digits, e.g. 1700000000): Used by Unix/Linux, PHP, Python's time.time(), and most APIs.
  • Milliseconds (13 digits, e.g. 1700000000000): Used by JavaScript's Date.now(), Java's System.currentTimeMillis().
  • Microseconds (16 digits): Used by PostgreSQL timestamps and some high-precision systems.
  • Nanoseconds (19 digits): Used by Go's time.Now().UnixNano().

A quick way to tell them apart: count the digits. If your timestamp has 10 digits, it is in seconds. If it has 13 digits, it is in milliseconds.
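The digit-count heuristic can be turned into a small helper. This is a hypothetical function (not part of any standard library) that guesses the unit and normalizes the value back to whole seconds, under the assumption that the timestamp refers to a present-day date:

```python
# Unit names and divisors keyed by the digit widths listed above.
_UNITS = {
    10: ("seconds", 1),
    13: ("milliseconds", 1_000),
    16: ("microseconds", 1_000_000),
    19: ("nanoseconds", 1_000_000_000),
}

def normalize(ts: int) -> tuple[str, int]:
    """Return (guessed unit, value in whole seconds) from the digit count."""
    digits = len(str(abs(ts)))
    # Round up to the nearest known width, so e.g. 9 digits reads as seconds.
    for width, (name, divisor) in _UNITS.items():
        if digits <= width:
            return name, ts // divisor
    raise ValueError(f"unexpectedly long timestamp: {ts}")

print(normalize(1_700_000_000))          # ('seconds', 1700000000)
print(normalize(1_700_000_000_000))      # ('milliseconds', 1700000000)
print(normalize(1_700_000_000_000_000))  # ('microseconds', 1700000000)
```

Because it is a heuristic, it misclassifies genuinely ancient or far-future dates; for real systems, document the unit explicitly instead of guessing.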

Converting Timestamps in Different Languages

  • JavaScript: new Date(1700000000 * 1000).toISOString() converts seconds to an ISO string. Math.floor(Date.now() / 1000) gives the current timestamp in seconds.
  • Python: datetime.fromtimestamp(1700000000, tz=timezone.utc) converts to a datetime object.
  • SQL (PostgreSQL): SELECT to_timestamp(1700000000) converts to a timestamp with time zone.
  • Bash: date -d @1700000000 on Linux or date -r 1700000000 on macOS.
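Taking the Python one-liner from the list above, a full round trip also needs the reverse direction, which the list does not show. One way to do it with the standard library is `datetime.fromisoformat`:

```python
from datetime import datetime, timezone

# seconds -> ISO 8601 string, as in the list above
dt = datetime.fromtimestamp(1_700_000_000, tz=timezone.utc)
iso = dt.isoformat().replace("+00:00", "Z")  # "Z" is the conventional UTC suffix
print(iso)  # 2023-11-14T22:13:20Z

# ISO 8601 string -> seconds (the reverse direction)
parsed = datetime.fromisoformat("2023-11-14T22:13:20+00:00")
print(int(parsed.timestamp()))  # 1700000000
```

The `+00:00` offset form is used here because `fromisoformat` only accepts the bare `Z` suffix from Python 3.11 onward.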

Common Pitfalls

  • Mixing seconds and milliseconds. Passing milliseconds to a function expecting seconds (or vice versa) gives wildly wrong dates. Always check the expected unit.
  • Ignoring time zones. Unix timestamps are always in UTC. When displaying dates to users, convert to their local time zone. When storing dates, always store in UTC.
  • String comparison of dates. Comparing date strings lexicographically only works when every string uses the same fixed-width, zero-padded format (ISO 8601 sorts correctly); with mixed or non-padded formats the results are wrong. When in doubt, convert to timestamps first and compare the numbers.
  • Floating-point timestamps. Some languages return timestamps as floats. Be careful with precision loss when converting between float and integer representations.
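Two of these pitfalls can be demonstrated in a few lines. This sketch shows what happens when a millisecond value reaches a seconds-based API, and why lexicographic comparison of non-ISO date strings goes wrong:

```python
from datetime import datetime, timezone

ms = 1_700_000_000_000  # 13 digits: milliseconds

# Pitfall 1: milliseconds fed to a seconds API land roughly 54,000 years
# out -- so far out that datetime cannot even represent the result.
try:
    datetime.fromtimestamp(ms, tz=timezone.utc)
except (OverflowError, ValueError, OSError):
    print("milliseconds mistaken for seconds: date out of range")

# Dividing by 1000 first gives the intended date.
print(datetime.fromtimestamp(ms / 1000, tz=timezone.utc).isoformat())
# 2023-11-14T22:13:20+00:00

# Pitfall 2: lexicographic comparison of non-ISO date strings.
# September 1 sorts *after* October 1 because "9" > "1" as characters.
assert "9/1/2023" > "10/1/2023"
```

The broad `except` clause is deliberate: the exact exception raised for an out-of-range timestamp varies by platform and Python version.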

Best Practices

  • Store timestamps in UTC. Always. Convert to local time only at the presentation layer.
  • Use ISO 8601 format (2026-03-09T12:00:00Z) for human-readable date exchange in APIs alongside or instead of raw timestamps.
  • Document whether your API returns seconds or milliseconds. This prevents integration bugs.
  • Use 64-bit integers for any new system. There is no reason to use 32-bit timestamps in 2026.
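The storage and exchange practices above can be sketched together. The field names in this payload are illustrative, not a standard:

```python
from datetime import datetime, timezone

# Store: the current instant as an integer UTC timestamp in seconds.
created_at = int(datetime.now(tz=timezone.utc).timestamp())

# Exchange: the same instant as an ISO 8601 string alongside the raw value,
# with the unit documented right in the field name's docs or schema.
payload = {
    "created_at": created_at,  # seconds since the Unix epoch
    "created_at_iso": datetime.fromtimestamp(created_at, tz=timezone.utc)
                              .isoformat().replace("+00:00", "Z"),
}
print(payload)
```

Conversion to the user's local time zone would then happen only in the client or presentation layer, never before storage.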

Try it yourself

Convert between Unix timestamps and human-readable dates instantly. Supports seconds and milliseconds. Runs entirely in your browser.

Open Timestamp Converter