Unix Timestamp Converter
Convert a Unix (epoch) timestamp to a human-readable date and time, or convert any date back to a Unix timestamp. Displays the current timestamp live. Runs entirely in your browser — no data uploaded.
Timestamp → Date
Date → Timestamp
How Unix Timestamps Work
Unix time counts the seconds elapsed since the Unix epoch: 00:00:00 UTC on 1 January 1970 (leap seconds are ignored). It is timezone-agnostic: a timestamp represents the same moment in time everywhere on Earth. Converting it to a human-readable date means applying a timezone offset at display time.
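For example, here is a minimal TypeScript sketch of both conversions using the browser's built-in Date object (the timestamp value is an arbitrary example):

```ts
// Timestamp → date: Date takes milliseconds, so multiply seconds by 1000.
const ts = 1713312000; // example timestamp, in seconds
const date = new Date(ts * 1000);
console.log(date.toISOString()); // "2024-04-17T00:00:00.000Z" (always UTC)

// The timezone offset is applied only when formatting for display:
console.log(date.toLocaleString("en-GB", { timeZone: "Europe/Berlin" }));

// Date → timestamp: Date.UTC takes a zero-based month and returns milliseconds.
const back = Math.floor(Date.UTC(2024, 3, 17, 0, 0, 0) / 1000);
console.log(back === ts); // true
```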
Seconds vs Milliseconds
| Format | Digits | Example | Used in |
|---|---|---|---|
| Seconds | 10 | 1713312000 | Unix, Python, databases |
| Milliseconds | 13 | 1713312000000 | JavaScript Date.now() |
| Microseconds | 16 | 1713312000000000 | High-precision logs |
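A converter like this one typically infers the unit from the digit count. Here is a TypeScript sketch of that heuristic; it assumes timestamps for dates between 2001 and 2286, where the seconds form has exactly 10 digits:

```ts
// Normalize a raw timestamp to seconds by guessing its unit from digit count.
function toSeconds(raw: number): number {
  const digits = Math.trunc(Math.abs(raw)).toString().length;
  if (digits >= 16) return Math.floor(raw / 1_000_000); // microseconds
  if (digits >= 13) return Math.floor(raw / 1_000);     // milliseconds
  return Math.floor(raw);                               // already seconds
}

console.log(toSeconds(1713312000));       // 1713312000
console.log(toSeconds(1713312000000));    // 1713312000
console.log(toSeconds(1713312000000000)); // 1713312000
```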
The Year 2038 Problem
32-bit systems store Unix timestamps as signed 32-bit integers. The maximum value (2,147,483,647) corresponds to 03:14:07 UTC on 19 January 2038. One second later, the counter overflows to a negative number, throwing date calculations back to December 1901. Modern 64-bit systems are unaffected and can represent timestamps for roughly 292 billion years.
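You can reproduce the wraparound in TypeScript, since the `| 0` operator coerces a number to a signed 32-bit integer, just as a 32-bit time_t would behave:

```ts
const max32 = 2_147_483_647; // 03:14:07 UTC, 19 January 2038
console.log(new Date(max32 * 1000).toISOString()); // "2038-01-19T03:14:07.000Z"

// One second past the maximum wraps around to the most negative 32-bit value.
const wrapped = (max32 + 1) | 0; // -2147483648
console.log(new Date(wrapped * 1000).toISOString()); // "1901-12-13T20:45:52.000Z"
```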
Frequently Asked Questions
What is a Unix timestamp?
The number of seconds since 00:00:00 UTC on 1 January 1970. It is timezone-independent and universally used in programming and databases.
Seconds vs milliseconds timestamps?
Standard Unix time is in seconds (10 digits). JavaScript's Date.now() returns milliseconds (13 digits). Divide by 1000 to convert milliseconds to seconds.
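For example, in TypeScript:

```ts
// Date.now() returns milliseconds; divide by 1000 and truncate for Unix seconds.
const seconds = Math.floor(Date.now() / 1000);
```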
What happens in 2038?
32-bit systems overflow their signed integer timestamp on 19 January 2038. 64-bit systems are unaffected.
Is Unix time timezone-aware?
Unix timestamps are always UTC. You apply a timezone offset when displaying as a human-readable date.