Unix Timestamp Converter
Convert between Unix timestamps and human-readable dates. Auto-detects seconds vs milliseconds. All processing in your browser.
Common Timestamps
Frequently Asked Questions
What is a Unix timestamp?
A Unix timestamp (also called Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970 00:00:00 UTC. It provides a universal, timezone-independent way to represent a point in time and is used extensively in programming, databases, and APIs.
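In JavaScript (the language this tool runs in), converting between timestamps and dates is a one-liner with the standard Date API. A minimal sketch:

```javascript
// Current Unix timestamp in seconds: Date.now() returns milliseconds,
// so divide by 1000 and truncate.
const nowSeconds = Math.floor(Date.now() / 1000);

// Convert a timestamp in seconds back to a human-readable UTC date
// by multiplying up to milliseconds for the Date constructor.
const date = new Date(1700000000 * 1000);
console.log(date.toISOString()); // "2023-11-14T22:13:20.000Z"
```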
What is the Year 2038 problem?
Many systems store Unix timestamps as 32-bit signed integers, which can represent dates up to January 19, 2038 at 03:14:07 UTC (timestamp 2,147,483,647). After this point, the value overflows to a negative number, causing dates to wrap back to 1901. Most modern systems now use 64-bit integers to avoid this issue.
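The overflow boundary can be demonstrated directly. JavaScript stores numbers as 64-bit floats, so Date itself is unaffected, but forcing the value through 32-bit integer arithmetic shows the wraparound:

```javascript
// The maximum value of a 32-bit signed integer.
const max32 = 2 ** 31 - 1; // 2147483647

// The last moment a 32-bit signed counter can represent.
console.log(new Date(max32 * 1000).toISOString()); // "2038-01-19T03:14:07.000Z"

// One second later, a 32-bit signed counter wraps to -2147483648.
// Bitwise OR with 0 forces JavaScript to truncate to 32 bits.
const wrapped = (max32 + 1) | 0; // -2147483648
console.log(new Date(wrapped * 1000).toISOString()); // "1901-12-13T20:45:52.000Z"
```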
Seconds vs milliseconds — how do I know which one I have?
Unix timestamps in seconds are typically 10 digits long (e.g., 1700000000), while millisecond timestamps are 13 digits (e.g., 1700000000000). This tool auto-detects the format: if the number is greater than 1 trillion, it treats it as milliseconds and divides by 1000.
What date formats can I enter?
You can enter dates in most common formats: ISO 8601 (2024-01-15T10:30:00Z), US format (Jan 15 2024), simple date (2024-01-15), or date with time (2024-01-15 10:30). The tool relies on your browser's built-in date parser, so most natural date formats will work, though parsing of non-ISO formats can vary between browsers.
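Under the hood, each of these formats goes through the browser's Date constructor. A sketch of how the parsing behaves:

```javascript
// ISO 8601 with an explicit Z suffix is always interpreted as UTC.
const iso = new Date("2024-01-15T10:30:00Z");
console.log(Math.floor(iso.getTime() / 1000)); // 1705314600

// US-style strings parse too, but strings without a timezone are
// interpreted as local time, so the resulting timestamp depends on
// the machine running the code.
const us = new Date("Jan 15 2024");
console.log(isNaN(us.getTime())); // false — parsed successfully
```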