How to convert C# DateTime.Ticks into Unix timestamp

With the .NET Framework's DateTime functions you can do a lot of nice things, and the handling is, in my opinion, very pleasant. The only requirement: you have to stay within a pure .NET environment. As soon as other systems come into play, the trouble begins. But why can't you simply compare DateTime.Ticks with the result of PHP's mktime() function?

If you request the “timestamp” of a DateTime object via DateTime.Ticks, you get back the number of ticks since 01/01/0001 00:00. A tick, in turn, is 100 nanoseconds long.

A Unix timestamp, as produced by mktime() for example, is by contrast the number of seconds since 01/01/1970. A direct comparison is therefore not possible; you first have to convert between the two units. How to do that is what I want to show you today, based on a few short snippets.
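To get a feeling for the two units, here is a minimal sketch (the variable names are my own) that prints the two relevant constants: the number of ticks per second and the tick value of the Unix epoch start.

using System;

// One tick is 100 ns, so there are 10,000,000 ticks per second.
long ticksPerSecond = TimeSpan.TicksPerSecond;        // 10000000
// The Unix epoch (01/01/1970) expressed in ticks since 01/01/0001.
long unixEpochTicks = new DateTime(1970, 1, 1).Ticks; // 621355968000000000

Console.WriteLine(ticksPerSecond);
Console.WriteLine(unixEpochTicks);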

Unix-Timestamp to DateTime.Ticks

private static DateTime TimeFromUnixTimestamp(int unixTimestamp)
{
    // Start of the Unix era: 01/01/1970 00:00
    DateTime unixYear0 = new DateTime(1970, 1, 1);
    // Convert the seconds of the Unix timestamp into 100 ns ticks
    long unixTimeStampInTicks = unixTimestamp * TimeSpan.TicksPerSecond;
    // Add the converted ticks to the ticks of the Unix epoch start
    DateTime dtUnix = new DateTime(unixYear0.Ticks + unixTimeStampInTicks);
    return dtUnix;
}

First, a DateTime object marking the start of the Unix era is created. Then the Unix timestamp (given in seconds) is converted into ticks. Finally, a new DateTime object is created from the sum of the ticks of the Unix epoch start and the converted ticks.
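A short usage example (assuming the call happens inside the same class, since the method is declared private): 1,000,000,000 seconds after the Unix epoch correspond to 2001-09-09 01:46:40.

DateTime dt = TimeFromUnixTimestamp(1000000000);
Console.WriteLine(dt.ToString("yyyy-MM-dd HH:mm:ss")); // 2001-09-09 01:46:40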

DateTime.Ticks to Unix-Timestamp

public static long UnixTimestampFromDateTime(DateTime date)
{
    // Subtract the ticks of the Unix epoch start from the date's ticks ...
    long unixTimestamp = date.Ticks - new DateTime(1970, 1, 1).Ticks;
    // ... and convert the remaining ticks into seconds
    unixTimestamp /= TimeSpan.TicksPerSecond;
    return unixTimestamp;
}

First, the ticks of the Unix epoch start are subtracted from the ticks of the given DateTime (which count from the beginning of the DateTime era, see the introduction). After that you almost have a Unix timestamp, only in ticks instead of seconds. To finish the conversion, the ticks have to be converted into seconds. (TimeSpan.TicksPerSecond is our friend.)
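As a quick plausibility check, here is a round trip with the value from the previous example (again assuming the methods are accessible from the calling code):

long ts = UnixTimestampFromDateTime(new DateTime(2001, 9, 9, 1, 46, 40));
Console.WriteLine(ts); // 1000000000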

[Java] System.currentTimeMillis() to DateTime.Ticks

private static DateTime TimeFromJavaTimestamp(long javaTimestamp)
{
    // Milliseconds to seconds, then reuse the Unix conversion from above
    return TimeFromUnixTimestamp((int)(javaTimestamp / 1000));
}

The Java timestamp (System.currentTimeMillis()) is almost identical to the standard Unix timestamp. Its era also begins on 01/01/1970. The only difference is that the Java timestamp is given in milliseconds instead of seconds. Thus we can reuse the function explained above: simply divide the Java timestamp by 1000 and pass the result in.
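A small example (same assumption as before regarding accessibility); note that the milliseconds are simply cut off by the integer division:

DateTime dt = TimeFromJavaTimestamp(1000000000123L);
Console.WriteLine(dt.ToString("yyyy-MM-dd HH:mm:ss.fff")); // 2001-09-09 01:46:40.000 – the 123 ms are gone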

DateTime.Ticks to [Java] System.currentTimeMillis()

public static long JavaTimestampFromDateTime(DateTime date)
{
    // Seconds since the Unix epoch, scaled up to milliseconds
    return (UnixTimestampFromDateTime(date) * 1000);
}

To convert DateTime.Ticks into a Java timestamp, we create a Unix timestamp out of the .NET timestamp and multiply it by 1000. (Why that works is explained in the previous paragraph.)
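One last example (same assumption as above): since the conversion goes through whole seconds, the result is always a multiple of 1000.

long javaTs = JavaTimestampFromDateTime(new DateTime(2001, 9, 9, 1, 46, 40, 123));
Console.WriteLine(javaTs); // 1000000000000 – the 123 ms are discarded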


I hope this article helped one reader or another. Should you have any questions or suggestions, please drop me a comment. I am happy to revise the article if you have a better suggestion. (Learning never stops…)

4 Comments

  1. Dave Mertens says:

    This is a very complicated way to convert to and from a unix timestamp.

    I was pointed to this post by an intern who said he got the code from this website.
    Here’s a far better approach, and it is certainly easier to read (and understand):

    public static class DateTimeFunctions
    {
        private static readonly DateTime epoch = new DateTime(1970, 1, 1);

        public static DateTime DateTimeFromUnixTimestamp(int timestamp)
        {
            return epoch.AddSeconds(timestamp);
        }

        public static int DateTimeToUnixTimestamp(DateTime date)
        {
            return (int)date.Subtract(epoch).TotalSeconds;
        }

        // extension method
        public static int ToUnixTimestamp(this DateTime date)
        {
            return DateTimeToUnixTimestamp(date);
        }
    }

    I hope that the formatting was preserved.

  2. Tobias Knauss says:

    It’s nonsense to call the Unix conversion functions from the Java conversion functions. You lose the millisecond precision.

    • Hi Tobias,

      sure, you lose precision. But let’s say you read data from a third-party database which contains timestamps in Unix format. In that case, you don’t care about losing precision on the C# side, because the data of interest is not expressed with that level of precision anyway.

      I get your point, and yes, one should think twice before doing conversions that lose precision, but as shown above, there are use cases where you have to handle Unix timestamps.

  3. daniel reznick says:

    The conversions to/from ticks/javatimestamp lose all subsecond precision that may exist on either side. Suggest anyone needing these conversions (ticks/java) look elsewhere.
