Why does comparing two identical byte arrays take a different amount of time on each run?


I am trying to compute hashes and then compare them to simulate a timing attack in C#.
This is the code I am using for this purpose:

private void btnHash_Click(object sender, EventArgs e)
{
    MD5 md5 = new MD5CryptoServiceProvider();
    var firstHashByte = md5.ComputeHash(ASCIIEncoding.ASCII.GetBytes(txtBoxText.Text));
    txtBoxHash.Text = Convert.ToBase64String(firstHashByte);

    var secondHashByte = md5.ComputeHash(ASCIIEncoding.ASCII.GetBytes(txtBoxSecondText.Text));
    txtBoxHashtwo.Text = Convert.ToBase64String(secondHashByte);

    Stopwatch stopwatch = new Stopwatch();
    stopwatch.Start();

    NormalEquality(firstHashByte, secondHashByte);
    //SlowEquals(firstHashByte, secondHashByte);

    stopwatch.Stop();
}


private static void NormalEquality(byte[] hashByte, byte[] hashByte2)
{
    bool bEqual = false;
    if (hashByte.Length == hashByte2.Length)
    {
        int i = 0;
        while ((i < hashByte2.Length) && (hashByte2[i] == hashByte[i]))
        {
            i += 1;
        }
        if (i == hashByte2.Length)
        {
            bEqual = true;
        }
    }
}
Each time I run this, I get different times, even for identical hashes! Why is this happening?
I also noticed that the following method, which is said to run in constant time for both identical and different hashes, fails to do so; it behaves just like the previous method, producing different times for any input (identical or different hashes!):

    private static bool SlowEquals(byte[] a, byte[] b)
    {
        uint diff = (uint)a.Length ^ (uint)b.Length;
        for (int i = 0; i < a.Length && i < b.Length; i++)
        {
            diff |= (uint)(a[i] ^ b[i]);
        }
        return diff == 0;
    }

Why is it like this? Any ideas?

By the way, as a side question:
Does C#'s string == comparison internally do this kind of array comparison, or is it a different story?
Whenever I used == on the Base64 versions of the hashes, I got a time of 0, both for identical and for different hashes. I did:

if (firstHashString == secondHashString)


You get different times because the stopwatch resolution is too low to measure such a small amount of code. Even though the resolution of the stopwatch is very high, the CPU still has time to run thousands of instructions between every tick of the stopwatch.

During the execution of the method the stopwatch will only advance a few ticks, so the resulting times will vary a lot from run to run.

If you run the method in a loop, for example a thousand or a million times, you get enough work to measure, and the relative variation in the measured time becomes small.
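A minimal sketch of that approach (the iteration count, the `Benchmark` class name, and the sample arrays are illustrative; `NormalEquality` is reduced to a bool-returning version of the method above):

```csharp
using System;
using System.Diagnostics;

public class Benchmark
{
    // Early-exit comparison: stops at the first mismatching byte,
    // which is what makes it vulnerable to timing analysis in principle.
    public static bool NormalEquality(byte[] a, byte[] b)
    {
        if (a.Length != b.Length) return false;
        int i = 0;
        while (i < b.Length && b[i] == a[i]) i++;
        return i == b.Length;
    }

    public static void Main()
    {
        // Two identical 16-byte "hashes" (MD5-sized), filled with fixed pseudo-random data.
        byte[] first = new byte[16];
        new Random(42).NextBytes(first);
        byte[] second = (byte[])first.Clone();

        // Time a large batch instead of a single call, so the total
        // spans many stopwatch ticks and the per-call average is stable.
        const int iterations = 1_000_000;
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            NormalEquality(first, second);
        sw.Stop();

        Console.WriteLine($"Total: {sw.ElapsedMilliseconds} ms");
        Console.WriteLine($"Average per call: {(double)sw.ElapsedTicks / iterations} ticks");
    }
}
```

With a batch this size, repeated runs should produce much more consistent totals than timing a single comparison, because the per-call cost now dominates the stopwatch's tick granularity.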