Jamie Zawinski sent in this comment.
From: Jamie Zawinski
Date: 1997 Dec 14
Subj: Re: dadadodo
Lawrence Hosken wrote:
> You've made a wonderful thing. Thank you.

Yours is nice too! Is that really unfiltered output, or did you do some selection in there? The sentences are awfully good.
> Figure out the probability tables for depth-N and depth-N-1.
> To figure out the next word:
> flip a coin

That's interesting, I think I'll try that... Hmm, the unfortunate thing about that is that it makes the *generation* of the chain probabilistic. If you use the same text as input twice in a row, you don't get the same histogram out. I'm not sure that sits well with me...
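For the curious, the coin-flip scheme under discussion is easy to sketch. The following is a minimal, hypothetical Python version, not dadadodo's actual code; the function names and the N=2 default are invented for illustration:

    import random
    from collections import defaultdict

    def build_table(words, n):
        # Map each n-word prefix to the words observed to follow it.
        table = defaultdict(list)
        for i in range(len(words) - n):
            table[tuple(words[i:i + n])].append(words[i + n])
        return table

    def generate(words, length=50, n=2):
        deep = build_table(words, n)         # depth-N table
        shallow = build_table(words, n - 1)  # depth-(N-1) table
        out = list(random.choice(list(deep.keys())))
        while len(out) < length:
            # Flip a coin: consult the depth-N table half the time,
            # the depth-(N-1) table the other half.
            if random.random() < 0.5:
                followers = deep.get(tuple(out[-n:]), [])
            else:
                followers = shallow.get(tuple(out[-(n - 1):]), [])
            if not followers:
                break
            out.append(random.choice(followers))
        return " ".join(out)

In this sketch the two tables are built deterministically from the input; the coin flip makes each walk of the chain nondeterministic, which is the property jwz is weighing above.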
> This might break down if I started reading in huge amounts
> of text and N became large. Then again, it might not.

I think that the larger the body of input text, the larger you want N to be, but it probably maxes out somewhere at (uh) 1/2 * length-of-a-typical-sentence or something like that. (Or maybe it's "verb phrase" and not sentence? I'll bet there's some language-specific constant about how many words our brains need in order to pull patterns out of groups of words. I'll bet it's not a very large number.)
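Read as a rough heuristic, that cap is easy to compute. A sketch in the same hypothetical Python as above; the sentence-splitting regex and the function name are crude stand-ins, not anything from dadadodo:

    import re

    def suggested_depth(text):
        # Cap N near half the length, in words, of a typical sentence.
        sentences = [s.split() for s in re.split(r"[.!?]+", text) if s.strip()]
        avg = sum(len(s) for s in sentences) / len(sentences)
        return max(1, round(avg / 2))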