Unix Timestamp Converter
Convert Unix timestamps to dates and back. Supports seconds, milliseconds, ISO 8601, and relative time.
Runs entirely in your browser — no data sent to servers.
What Is a Unix Timestamp?
A Unix timestamp (also called Unix time, POSIX time, or epoch time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC — the Unix epoch. It is the most widely used time representation in programming because it is timezone-independent and easily computed.
As of early 2025, the current Unix timestamp is approximately 1,744,000,000 (ten digits). Millisecond-precision timestamps (as returned by JavaScript's Date.now()) are thirteen digits.
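Converting between a ten-digit timestamp and a readable date is a one-liner in most languages. A minimal Python sketch, using the approximate timestamp quoted above:

```python
from datetime import datetime, timezone

# Convert a Unix timestamp in seconds to an ISO 8601 UTC date
ts = 1744000000
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())        # → 2025-04-07T04:26:40+00:00

# And back again: an aware datetime to Unix seconds
print(int(dt.timestamp()))   # → 1744000000
```

Passing tz=timezone.utc keeps the conversion timezone-independent; omitting it would interpret the result in the machine's local zone.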
Getting the Current Timestamp in Code
```javascript
// Milliseconds (13 digits)
Date.now()                       // → 1744123456789

// Seconds (10 digits)
Math.floor(Date.now() / 1000)    // → 1744123456
```
```python
import time

# Seconds (float)
time.time()         # → 1744123456.789

# Seconds (int)
int(time.time())
```
```go
import "time"

// Seconds
time.Now().Unix()

// Milliseconds
time.Now().UnixMilli()
```
```sql
-- Current timestamp
SELECT EXTRACT(EPOCH FROM NOW())::INT;

-- Convert to timestamp
SELECT TO_TIMESTAMP(1744123456);
```
The Year 2038 Problem
Systems that store Unix timestamps as 32-bit signed integers will overflow on January 19, 2038 at 03:14:07 UTC, when the count reaches 2,147,483,647, the maximum value of a 32-bit signed integer. One second later, the value wraps around to the most negative 32-bit integer, representing a date in December 1901.
Modern systems use 64-bit integers, which can represent times roughly 292 billion years into the future. If you work with legacy systems or databases, store timestamps in BIGINT (not INT) columns.
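The wraparound is easy to demonstrate by forcing the timestamp through 32-bit storage. A Python sketch using struct to mimic a 32-bit signed integer column:

```python
import struct
from datetime import datetime, timezone

# The largest timestamp a 32-bit signed integer can hold
max32 = 2**31 - 1   # 2147483647
print(datetime.fromtimestamp(max32, tz=timezone.utc))
# → 2038-01-19 03:14:07+00:00

# One second later, the bit pattern reinterpreted as signed wraps negative
wrapped = struct.unpack('<i', struct.pack('<I', max32 + 1))[0]
print(wrapped)      # → -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# → 1901-12-13 20:45:52+00:00
```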
Seconds vs Milliseconds
A common source of bugs: mixing up second and millisecond timestamps.
- 10 digits — Unix seconds (e.g., 1744123456)
- 13 digits — Unix milliseconds, used in JavaScript (e.g., 1744123456789)
- 16 digits — Unix microseconds (PostgreSQL, some logging systems)
This converter automatically detects whether your input is seconds or milliseconds based on the number of digits.
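That digit-count heuristic can be sketched in a few lines of Python (detect_unit is a hypothetical name for illustration, not part of any library):

```python
def detect_unit(ts: int) -> str:
    """Guess a Unix timestamp's unit from its digit count."""
    digits = len(str(abs(ts)))
    if digits <= 10:
        return "seconds"
    elif digits <= 13:
        return "milliseconds"
    else:
        return "microseconds"

print(detect_unit(1744123456))        # → seconds
print(detect_unit(1744123456789))     # → milliseconds
print(detect_unit(1744123456789000))  # → microseconds
```

Note the heuristic is only a guess: a millisecond timestamp from before September 2001 also has ten or fewer digits, so ambiguous inputs should let the user override the detected unit.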
Built by Noah AI Labs · Part of the free developer tools suite