## page was renamed from HlpLab/Labmeeting/Su08w1
#acl HlpLabGroup:read,write,delete,revert,admin All:read
#format wiki
#pragma section-numbers 3
#language en

''I won't actually be at this meeting, as I'm going to my brother's college graduation. I'll be making notes on this page that will hopefully help get discussions started, but it would be great if someone wanted to volunteer to lead the meeting.'' -- AustinFrank [[DateTime(2008-05-14T23:36:17Z)]]

We're going to the original source for InformationTheory. This serves the joint purpose of a) providing some background for UniformInformationDensity, and b) giving us all a good starting point for making connections to InformationTheory in our StatsMiniCourse.

= Required reading =

The foundational paper in InformationTheory is [attachment:shannon-48.pdf Shannon '48]. Read at least through page 28 of the PDF. Pay special attention to Part II, which starts on page 19.

= Other sources =

 * An application of the ideas in the '48 paper appears in [attachment:shannon-51.pdf Shannon '51], "Prediction and Entropy of Printed English".
 * Shannon's most recent paper on the topic that I could find is [attachment:shannon-84.pdf Shannon '84]. It discusses optimal transmission rates under different noise conditions.
 * [http://lccn.loc.gov/63016192 Abramson '63] is reportedly a good text, though it is now out of print.
 * [http://lccn.loc.gov/97014382 Kullback '97] is a reprint of an early text on the connections between statistics and InformationTheory.
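As a concrete tie-in to the StatsMiniCourse: the quantity Shannon estimates for printed English in the '51 paper is entropy, H = -Σ p(x) log2 p(x). Below is a minimal sketch (my own illustration, not code from any of the readings; the function name and sample strings are made up) of the zeroth-order version of that estimate, computed from single-character frequencies. Shannon's paper goes further by conditioning on preceding context, which is what drives the estimate down toward roughly one bit per character.

```python
import math
from collections import Counter

def unigram_entropy(text):
    """Zeroth-order entropy estimate in bits per character:
    H = -sum over characters c of p(c) * log2 p(c),
    where p(c) is the relative frequency of c in the text."""
    counts = Counter(text)
    n = len(text)
    return -sum((k / n) * math.log2(k / n) for k in counts.values())

# A uniform distribution over 4 symbols gives exactly 2 bits/char,
# while a single repeated symbol carries no information.
print(unigram_entropy("abcd"))  # 2.0
print(unigram_entropy("aaaa"))  # 0.0
```

Running this on a longer English sample gives a number near Shannon's zeroth-order figure (around 4 bits per character over the 26-letter-plus-space alphabet); the gap between that and his conditional estimates is the redundancy of English.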