🔒 All tools run entirely in your browser. No data leaves your device.

Timestamp Converter

Convert Unix timestamps, ISO 8601, and RFC 2822 dates. Live current time in multiple formats with timezone support. Runs in your browser.

Current time live

Unix (seconds)
1777079237
Unix (milliseconds)
1777079237143
ISO 8601 (UTC)
2026-04-25T01:07:17.143Z
RFC 2822 / HTTP
Sat, 25 Apr 2026 01:07:17 GMT
Local (UTC)
Saturday, April 25, 2026 at 1:07:17 AM UTC

Convert a timestamp

About this tool

Unix time is the number of seconds that have elapsed since 1970-01-01 00:00:00 UTC. It’s the lingua franca of computer timestamps — compact, sortable, timezone-independent, and unambiguous. ISO 8601 (2026-04-24T17:00:00Z) is the most common human-readable timestamp format and is what you should emit whenever a human might read the value.
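The round trip between the two forms is a one-line scaling in plain JavaScript. A minimal sketch; the helper names `toIso` and `toUnix` are illustrative, not part of the tool:

```javascript
// Unix seconds -> ISO 8601 (UTC): Date takes milliseconds, so scale by 1000.
const toIso = (unixSeconds) => new Date(unixSeconds * 1000).toISOString();

// ISO 8601 -> Unix seconds: Date.parse returns milliseconds since the epoch.
const toUnix = (iso) => Math.floor(Date.parse(iso) / 1000);

console.log(toIso(1777079237));              // "2026-04-25T01:07:17.000Z"
console.log(toUnix("2026-04-25T01:07:17Z")); // 1777079237
```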

Common sources of confusion: seconds vs. milliseconds. Unix timestamps are normally seconds, but JavaScript APIs and some databases use milliseconds, so a value in the wrong unit renders a thousand times earlier or later than intended. UTC vs. local: 2026-04-24T17:00:00 without a Z or offset is ambiguous, and most systems will interpret it as local time. Emitting an ISO 8601 value without a timezone is a common bug.

The Y2038 problem is the 32-bit equivalent of Y2K: at 03:14:07 UTC on 19 January 2038, a signed 32-bit Unix timestamp overflows. Any system still using 32-bit time_t will suddenly see a date in 1901. Most 64-bit platforms have resolved this; embedded systems and older file formats may not have.
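The overflow boundary is easy to reproduce: take the largest signed 32-bit value as Unix seconds, then the value it wraps to.

```javascript
// The largest signed 32-bit integer, read as Unix seconds:
const max32 = 2 ** 31 - 1; // 2147483647
console.log(new Date(max32 * 1000).toISOString()); // "2038-01-19T03:14:07.000Z"

// One second later, a wrapped signed 32-bit counter lands back in 1901:
const wrapped = -(2 ** 31); // -2147483648
console.log(new Date(wrapped * 1000).toISOString()); // "1901-12-13T20:45:52.000Z"
```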

This tool supports timezone-aware conversion via the browser’s Intl APIs, which handle daylight-saving transitions correctly for every named zone. It picks up your local zone automatically and lets you switch to any common one.
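Zone-aware formatting of this kind needs only Intl.DateTimeFormat with a timeZone option. A sketch of the idea (the locale, styles, and helper name are illustrative choices, not the tool's exact code):

```javascript
// One instant, rendered in two named zones. Intl applies each zone's
// rules, including DST, from the host's copy of the tz database.
const instant = new Date(Date.UTC(2026, 3, 25, 1, 7, 17)); // 2026-04-25T01:07:17Z

const fmt = (tz) =>
  new Intl.DateTimeFormat("en-US", {
    timeZone: tz,
    dateStyle: "medium",
    timeStyle: "long",
  }).format(instant);

console.log(fmt("America/New_York")); // evening of April 24, EDT
console.log(fmt("Asia/Tokyo"));       // morning of April 25, UTC+9
```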

Frequently asked questions

What is the difference between Unix time and ISO 8601?

Unix time is a single integer — the number of seconds (or milliseconds) since 1970-01-01 UTC. ISO 8601 is a human-readable date format like "2026-04-24T17:00:00Z". Unix time is compact and always unambiguous; ISO 8601 is readable, but unambiguous only when it carries a Z or UTC offset.

Why do some APIs use seconds and others milliseconds?

JavaScript’s Date object uses milliseconds natively, so JS-heavy APIs often expose milliseconds. Databases, Unix shell tools, and most backend APIs use seconds. The tool auto-detects based on length: 10 digits → seconds, 13 digits → milliseconds.
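The digit-length heuristic described above can be sketched in a few lines; the function name and the decision to reject other lengths are illustrative, not the tool's exact behavior:

```javascript
// 10 digits -> seconds, 13 digits -> milliseconds; anything else is
// ambiguous and left to the caller.
function parseTimestamp(input) {
  const digits = input.trim();
  if (!/^\d+$/.test(digits)) return null;
  if (digits.length === 10) return new Date(Number(digits) * 1000);
  if (digits.length === 13) return new Date(Number(digits));
  return null; // ambiguous length
}

console.log(parseTimestamp("1777079237").toISOString());    // "2026-04-25T01:07:17.000Z"
console.log(parseTimestamp("1777079237143").toISOString()); // "2026-04-25T01:07:17.143Z"
```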

How do I convert a timestamp to a specific timezone?

Pick a timezone in the dropdown. The "in {zone}" row in the results shows the same moment in that zone. The underlying timestamp does not change — only the display.

What is the Y2038 problem?

Unix timestamps stored in signed 32-bit integers overflow at 03:14:07 UTC on 19 January 2038. Any system still using 32-bit Unix time will misinterpret timestamps after that. Most modern systems use 64-bit integers, which postpone the problem by roughly 292 billion years.

Why does my timestamp show as a date in 1970 or the far future?

Almost always a seconds/milliseconds confusion. A value like `1745500800` interpreted as milliseconds renders as 21 January 1970. A value like `1745500800000` interpreted as seconds renders as a date around the year 57,282. The tool warns when the parsed year is unreasonable.
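A plausibility check along these lines is straightforward; the function name and the 1971–2100 window are illustrative assumptions, not the tool's actual thresholds:

```javascript
// Flag a parse whose year falls outside a sane window, which usually
// means the value was read in the wrong unit.
function unitLooksWrong(value, unit) {
  const ms = unit === "s" ? value * 1000 : value;
  const year = new Date(ms).getUTCFullYear();
  return year < 1971 || year > 2100;
}

console.log(unitLooksWrong(1745500800, "ms"));   // true  (lands in Jan 1970)
console.log(unitLooksWrong(1745500800, "s"));    // false (a sane 2025 date)
console.log(unitLooksWrong(1745500800000, "s")); // true  (far future)
```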

Is daylight saving time handled?

Yes. The browser’s Intl.DateTimeFormat is timezone-aware and applies DST rules correctly for any named zone (America/New_York etc.).
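You can verify DST handling by formatting two instants on either side of a transition. In 2026, America/New_York switches to daylight time on 8 March at 2:00 AM local; the helper name below is illustrative:

```javascript
// Extract the short zone name (EST/EDT) for an instant in New York.
const zoneName = (date) =>
  new Intl.DateTimeFormat("en-US", {
    timeZone: "America/New_York",
    timeZoneName: "short",
  })
    .formatToParts(date)
    .find((p) => p.type === "timeZoneName").value;

console.log(zoneName(new Date("2026-03-08T06:00:00Z"))); // "EST" (1:00 AM local)
console.log(zoneName(new Date("2026-03-08T08:00:00Z"))); // "EDT" (4:00 AM local)
```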