microid - Re: [Microid] MicroID hashing algorithm(s) and normalization

  • From: Fred Stutzman <fred AT metalab.unc.edu>
  • To: Eran Sandler <eran AT yedda.com>
  • Cc: microid AT lists.ibiblio.org
  • Subject: Re: [Microid] MicroID hashing algorithm(s) and normalization
  • Date: Tue, 28 Nov 2006 13:21:45 -0500 (EST)

On Tue, 28 Nov 2006, Eran Sandler wrote:

Hello all,

In Yedda, instead of normalizing the URL, we use the URL from the request itself (as it is passed in the HTTP header) to create the microid.

This does raise a few issues, one of which is that we get two different microids depending on whether a page is accessed with or without a trailing slash.

The other is the unnecessary re-computation of the microid on every request.
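Roughly, the per-request computation looks like this (a Python sketch; it assumes the commonly cited sha1-of-concatenated-hex-digests construction, and the mailto: address and request_url are only placeholders):

    import hashlib

    def microid(identity_uri, page_url):
        # Commonly cited MicroID construction: sha1 over the concatenation of
        # the hex sha1 of the identity URI and the hex sha1 of the page URL.
        left = hashlib.sha1(identity_uri.encode("utf-8")).hexdigest()
        right = hashlib.sha1(page_url.encode("utf-8")).hexdigest()
        return hashlib.sha1((left + right).encode("utf-8")).hexdigest()

    # Recomputed on every request, from whatever URL the HTTP layer reports:
    # token = microid("mailto:someone@example.com", request_url)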

My suggestion regarding normalization is relatively simple (a sketch follows the list):

- Make sure the URL is lowercased

- If the hostname starts with "www." (e.g.
http://www.microid.org/somewhere/), remove the "www." part (giving
http://microid.org/somewhere/)

- If the URL ends with a file (i.e. the last segment has a file extension, e.g.
http://microid.org/somewhere/this.html), leave it as it is

- If the URL ends with a directory or path (not a file, e.g.
http://microid.org/somewhere), make sure it has a trailing slash
(giving http://microid.org/somewhere/)
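A minimal sketch of those rules in Python (normalize_url is just an illustrative name; query strings and fragments are passed through untouched):

    from urllib.parse import urlsplit, urlunsplit

    def normalize_url(url):
        parts = urlsplit(url.strip())
        host = parts.netloc.lower()
        # Drop a leading "www." from the hostname.
        if host.startswith("www."):
            host = host[len("www."):]
        path = parts.path.lower()
        # If the last segment has no extension, treat it as a directory and
        # make sure it ends with a trailing slash.
        last = path.rsplit("/", 1)[-1]
        if "." not in last and not path.endswith("/"):
            path += "/"
        return urlunsplit((parts.scheme.lower(), host, path,
                           parts.query, parts.fragment))

    # normalize_url("http://WWW.MicroID.org/Somewhere")
    #   -> "http://microid.org/somewhere/"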

While I see where you are coming from, I think that for a MicroID system to be flexible, computation based on the inbound URL is key. That inbound computation adds overhead, but likely no more than any other dynamic element on the page.

If I attempt to claim a page, the verifier should first look to see if the hash matches the URI I gave it. After that, the verifier can also check normalized URIs (i.e. it can compute hashes for all the cases mentioned above).

I think this problem is easier to solve on the verifier end than on the producer end. Depending on how strict a verifier wants to be, it can reject normalized URIs or accept them. For the most part it seems very safe to check for normalized URIs. Normalizing at the producer level would add complexity.
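Concretely, a verifier along those lines could look something like this (a sketch reusing the microid() and normalize_url() functions from the earlier sketches; the names are only illustrative):

    def candidate_urls(url):
        # The exact URL first, then any normalized variant the verifier accepts.
        candidates = [url]
        normalized = normalize_url(url)
        if normalized != url:
            candidates.append(normalized)
        return candidates

    def verify_claim(published_hash, identity_uri, url):
        # Accept the claim if the published MicroID matches the exact URL
        # or one of its normalized variants.
        return any(microid(identity_uri, u) == published_hash
                   for u in candidate_urls(url))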

Regarding the hashing algorithm, I liked the suggestion in the blog
post's comments (http://microid.org/blog/?p=4) of specifying the exact
algorithm alongside the hash value. The main problem with this approach is that
the various systems like claimid.com will have to handle quite a few of
these hashing algorithms (unless the spec specifically gives a couple of
options and that's it).

I think I agree with you on this, though I can't claim to be an expert on hashing algorithms. It is easy for verifiers to verify against a number of predictable outcomes. Once we agree on a spec we can decide what becomes legacy and what becomes mainstream, and it will be easy for verifiers to maintain legacy support for some time.
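For what it's worth, handling a small fixed set of labeled algorithms on the verifier side might look like this (a sketch; the "algorithm:hash" syntax and the sha256 option are only placeholders, since no syntax has been fixed yet):

    import hashlib

    SUPPORTED = {"sha1": hashlib.sha1, "sha256": hashlib.sha256}

    def verify_labeled(labeled_value, identity_uri, url):
        # Hypothetical "algorithm:hash" form; reject anything outside the
        # small set of algorithms the spec would allow.
        algorithm, _, claimed = labeled_value.partition(":")
        hasher = SUPPORTED.get(algorithm)
        if hasher is None:
            return False
        left = hasher(identity_uri.encode("utf-8")).hexdigest()
        right = hasher(url.encode("utf-8")).hexdigest()
        return hasher((left + right).encode("utf-8")).hexdigest() == claimed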

-Fred

What do you say?

Eran

--
Fred Stutzman
claimID.com
919-260-8508
AIM: chimprawk




