I’m currently working on a JavaScript function that formats dates using the Intl.DateTimeFormat constructor. The function takes a date string as input and formats it to a specific time zone using the timeZone option.
Here’s a simplified version of the function:
const getFormattedDateString = (date) => {
  return new Intl.DateTimeFormat('en-US', {
    dateStyle: 'full', timeStyle: 'full', timeZone: 'Africa/Casablanca'
  }).format(new Date(date));
};
As you can see, I use the constructor new Date(date) before passing it to the format function. My understanding is that if the input date parameter is in UTC (e.g., ‘2023-06-19T18:24:41Z’), the formatted date string should remain consistent across different local time zones (wherever the user is in the world). However, if the input date parameter is not in UTC (e.g., ‘2023-06-19’), then formatting it to a different time zone will result in different formatted date strings depending on where in the world the function is executed.
Here are my two questions:

- So if I want the result of the function to be the same anywhere in the world (for a specific UTC date), I should make sure the date parameter is in UTC. Can anyone confirm this?
- I don’t understand something. I saw that when I do this:
  console.log(new Date("2023-06-19T18:24:41Z"))
  it gives me a date object in my time zone. So why does using it in the function above make the date object consistent regardless of the location of the user running the code? Is there a difference between what I log in the console and the date object?
2 Answers
The parsing done by the Date object is fully specified, so you can rely on it for strings in the format it defines (now; that wasn’t always true). If the string has a timezone indicator on it (like Z for UTC), then the string defines a datetime in that timezone. The tricky bit is that if you don’t have a timezone indicator in the string, the default is local time if the string has a time (2024-02-09T08:23) but UTC if it doesn’t (2024-02-09).
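A quick sketch of those three cases (the commented results assume a host timezone other than UTC; the middle line is the one that varies from machine to machine):

new Date('2024-02-09T08:23Z').toISOString();  // "2024-02-09T08:23:00.000Z" everywhere (indicator says UTC)
new Date('2024-02-09T08:23').toISOString();   // varies: 08:23 local time, converted to UTC by toISOString
new Date('2024-02-09').toISOString();         // "2024-02-09T00:00:00.000Z" everywhere (date-only means UTC)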
A Date object is just a set of methods wrapped around a number of milliseconds since The Epoch (1970-01-01T00:00:00Z). Some of those methods (e.g. getHours) use the local timezone of the environment the code is running in. Others (e.g. getUTCHours) use UTC. Date objects have no concept of timezones other than UTC and the local timezone of the environment where the code is running.
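For instance (a sketch; only the getHours result depends on where the code runs):

const d = new Date(0);  // time value 0, i.e. 1970-01-01T00:00:00Z
d.getTime();      // 0 on every machine
d.getUTCHours();  // 0 on every machine
d.getHours();     // depends on the host timezone (e.g. 1 on a UTC+1 machine)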
With your console.log, you’re apparently seeing the result of formatting the date in local time (it varies depending on the console implementation).
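Which is why, for the string in the question, you see something like this (the exact console rendering differs between environments):

const d = new Date('2023-06-19T18:24:41Z');
console.log(d);               // usually rendered in the console's local timezone
console.log(d.toISOString()); // "2023-06-19T18:24:41.000Z" on every machine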
In contrast, with your Intl.DateTimeFormat code, you’re telling it what timezone to use (timeZone: 'Africa/Casablanca'), so of course the resulting string uses that timezone.
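For example, the question’s formatter gives the same string everywhere (the exact wording depends on the runtime’s ICU/CLDR data, but it does not change with the host timezone):

const fmt = new Intl.DateTimeFormat('en-US', {
  dateStyle: 'full', timeStyle: 'full', timeZone: 'Africa/Casablanca'
});
fmt.format(new Date('2023-06-19T18:24:41Z'));
// same string on every machine; Casablanca was on UTC+1 that day, so the time shown is 7:24:41 PM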
If the timestamp is in one of the formats supported by ECMAScript and includes an offset, then all parsers should parse it to exactly the same time value (millisecond offset from the ECMAScript epoch). It doesn’t need to be UTC.
In regard to supported formats:

If the format of toString is used, any valid offset (including non-zero values) will produce consistent results.
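For example (an illustrative toString-style string with a non-zero offset, chosen so that it matches the time value quoted below):

// illustrative string; any valid offset gives an equally consistent result
new Date('Fri Feb 09 2024 22:26:12 GMT+1100 (Australian Eastern Daylight Time)').getTime();
// 1707477972000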
Similarly with timestamps consistent with the Date Time String Format:
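(Again an illustrative string, expressing the same instant with a +11:00 offset.)

new Date('2024-02-09T22:26:12+11:00').getTime();
// 1707477972000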
Both the above produce the time value 1707477972000 regardless of the host timezone or offset settings. For the toString format, deleting the timezone name (or changing it to a nonsensical value) has no effect. Deleting the offset makes it inconsistent with ECMA-262, so parsing is implementation dependent (i.e. unreliable). For the Date Time String Format, if the offset is removed, timestamps are parsed as local.
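A sketch of those two points (the second result varies with the machine running it):

new Date('Fri Feb 09 2024 22:26:12 GMT+1100 (Nonsense Name)').getTime();
// still 1707477972000: the parenthesised name is ignored
new Date('2024-02-09T22:26:12').getTime();
// no offset in this format, so it is parsed as local time and depends on the host timezone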
The timestamp in the question (‘2023-06-19T18:24:41Z’) is in one of the supported formats, so all parsers should parse it to exactly the same time value (1687199081000). The trailing "Z" indicates zero offset (UTC) and is the same as "GMT+00:00".
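Concretely (the two strings below denote the same instant):

new Date('2023-06-19T18:24:41Z').getTime();       // 1687199081000
new Date('2023-06-19T18:24:41+00:00').getTime();  // 1687199081000, "Z" and "+00:00" are equivalent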
As T.J. Crowder explained, it’s an ECMAScript quirk (some might say bug) that timestamps of the format YYYY-MM-DD are parsed as UTC, but any other timestamp without an offset is parsed as local to the host doing the parsing.
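Using the question’s date-only string to illustrate the quirk (only the second line varies):

new Date('2023-06-19').toISOString();       // "2023-06-19T00:00:00.000Z" everywhere (date-only → UTC)
new Date('2023-06-19T00:00').toISOString(); // depends on the host timezone (date + time, no offset → local)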
PS
A pedant would say that UTC is a time standard, not a timezone. 😉