
Probability Theory: The Logic of Science (English Edition)

Posted: 2010-08-01 03:00:13  By: kind887



Basic information
Publisher: Posts & Telecom Press (人民邮电出版社)
Pages: 727
Publication date: April 2009
ISBN: 7115195366 / 9787115195364
Barcode: 9787115195364
Edition: 1st
Binding: Paperback
Format: 16 (16开)
Language of text: English
Series: 图灵原版数学统计学系列 (Turing original mathematics and statistics series)
Original title: Probability Theory: The Logic of Science

Synopsis: Probability Theory: The Logic of Science (English Edition) unifies probability and statistical inference, vividly describing from a fresh viewpoint the wide applications of probability theory in physics, mathematics, economics, chemistry, biology, and other fields. In particular, it expounds the rich applications of Bayesian theory, filling a gap left by other probability and statistics textbooks. The book is in two parts. Part I, comprising 10 chapters, presents the principles of probability theory and their elementary applications, including sampling theory, hypothesis testing, and parameter estimation; Part II, comprising 12 chapters, presents advanced applications, such as to physical measurements and communication theory. The book also contains a large number of exercises and is comprehensive and well organized.
The book is not confined to any one field: it is suitable for workers in any area involving data analysis, and can also serve as a textbook for related courses at the advanced undergraduate or graduate level.
About the author: E. T. Jaynes (1922-1998) was a renowned mathematician and physicist. He was a professor at Washington University in St. Louis and at Stanford University. He is best known for proposing the maximum entropy principle in thermodynamics (1957) and the Jaynes-Cummings model in quantum optics (1963). Over the following decades he pursued the major question of making probability and statistical inference the logical foundation of science as a whole, and his results and insights ultimately crystallized into this book.
Media reviews: "This is the most important book on probability theory in decades. It resolved many problems that had long puzzled me. Probability, statistics, pattern recognition, data analysis, machine learning, data mining... if your work involves processing incomplete and uncertain information, you should study this book carefully. It will greatly change the way you think about problems."
— Kevin S. Van Horn, senior expert in computing technology, probability, and statistics
"This book is deservedly popular. Readers will find much thought-provoking material in it, concerning not only everyday practice but statistics and probability theory themselves. It is essential reading for statisticians as well as for scientists and technical workers in every applied field."
— Mathematical Reviews (USA)
"This is not an ordinary textbook. It gives a complete and thorough exposition of Bayesian methods in statistics. There are hundreds of examples, enough to give you a thorough grasp of both the theory and its applications. Everyone interested in problems or applications of statistics should study it carefully."
— SIAM News
Editor's recommendation: Probability Theory: The Logic of Science (English Edition) unifies probability and statistical inference and, from a fresh viewpoint, vividly describes the broad applications of probability theory and Bayesian methods in physics, mathematics, chemistry, biology, economics, sociology, and other fields, filling a gap left by other probability and statistics textbooks. It is suitable not only for specialists in probability and statistics but is also essential reading for scientists and technical workers in any field who need to apply statistical inference.
This is an extraordinary book. It is the posthumous work of the noted mathematical physicist Jaynes and distills his deep thinking about probability theory over some 40 years. Since publication the original edition has been enormously influential, has been praised by many experts and scholars, and holds a full five-star reader rating on Amazon. Building on the ideas of masters such as H. Jeffreys, R. T. Cox, C. E. Shannon, and G. Polya, the author continues their exploration, examines probability theory in a broader context, and proposes probabilistic inference as the logical foundation of all of science, so as to cope with the fact that the objects of real scientific research usually involve incomplete or uncertain information. In doing so he goes beyond conventional probability theory and beyond the traditional mindset of mathematical logic.
Contents
Part I Principles and elementary applications
1 Plausible reasoning
1.1 Deductive and plausible reasoning
1.2 Analogies with physical theories
1.3 The thinking computer
1.4 Introducing the robot
1.5 Boolean algebra
1.6 Adequate sets of operations
1.7 The basic desiderata
1.8 Comments
1.8.1 Common language vs. formal logic
1.8.2 Nitpicking

2 The quantitative rules
2.1 The product rule
2.2 The sum rule
2.3 Qualitative properties
2.4 Numerical values
2.5 Notation and finite-sets policy
2.6 Comments
2.6.1 'Subjective' vs. 'objective'
2.6.2 Gödel's theorem
2.6.3 Venn diagrams
2.6.4 The 'Kolmogorov axioms'

3 Elementary sampling theory
3.1 Sampling without replacement
3.2 Logic vs. propensity
3.3 Reasoning from less precise information
3.4 Expectations
3.5 Other forms and extensions
3.6 Probability as a mathematical tool
3.7 The binomial distribution
3.8 Sampling with replacement
3.8.1 Digression: a sermon on reality vs. models
3.9 Correction for correlations
3.10 Simplification
3.11 Comments
3.11.1 A look ahead

4 Elementary hypothesis testing
4.1 Prior probabilities
4.2 Testing binary hypotheses with binary data
4.3 Nonextensibility beyond the binary case
4.4 Multiple hypothesis testing
4.4.1 Digression on another derivation
4.5 Continuous probability distribution functions
4.6 Testing an infinite number of hypotheses
4.6.1 Historical digression
4.7 Simple and compound (or composite) hypotheses
4.8 Comments
4.8.1 Etymology
4.8.2 What have we accomplished?

5 Queer uses for probability theory
5.1 Extrasensory perception
5.2 Mrs Stewart's telepathic powers
5.2.1 Digression on the normal approximation
5.2.2 Back to Mrs Stewart
5.3 Converging and diverging views
5.4 Visual perception - evolution into Bayesianity?
5.5 The discovery of Neptune
5.5.1 Digression on alternative hypotheses
5.5.2 Back to Newton
5.6 Horse racing and weather forecasting
5.6.1 Discussion
5.7 Paradoxes of intuition
5.8 Bayesian jurisprudence
5.9 Comments
5.9.1 What is queer?

6 Elementary parameter estimation
6.1 Inversion of the urn distributions
6.2 Both N and R unknown
6.3 Uniform prior
6.4 Predictive distributions
6.5 Truncated uniform priors
6.6 A concave prior
6.7 The binomial monkey prior
6.8 Metamorphosis into continuous parameter estimation
6.9 Estimation with a binomial sampling distribution
6.9.1 Digression on optional stopping
6.10 Compound estimation problems
6.11 A simple Bayesian estimate: quantitative prior information
6.11.1 From posterior distribution function to estimate
6.12 Effects of qualitative prior information
6.13 Choice of a prior
6.14 On with the calculation!
6.15 The Jeffreys prior
6.16 The point of it all
6.17 Interval estimation
6.18 Calculation of variance
6.19 Generalization and asymptotic forms
6.20 Rectangular sampling distribution
6.21 Small samples
6.22 Mathematical trickery
6.23 Comments

7 The central, Gaussian or normal distribution
7.1 The gravitating phenomenon
7.2 The Herschel-Maxwell derivation
7.3 The Gauss derivation
7.4 Historical importance of Gauss's result
7.5 The Landon derivation
7.6 Why the ubiquitous use of Gaussian distributions?
7.7 Why the ubiquitous success?
7.8 What estimator should we use?
7.9 Error cancellation
7.10 The near irrelevance of sampling frequency distributions
7.11 The remarkable efficiency of information transfer
7.12 Other sampling distributions
7.13 Nuisance parameters as safety devices
7.14 More general properties
7.15 Convolution of Gaussians
7.16 The central limit theorem
7.17 Accuracy of computations
7.18 Galton's discovery
7.19 Population dynamics and Darwinian evolution
7.20 Evolution of humming-birds and flowers
7.21 Application to economics
7.22 The great inequality of Jupiter and Saturn
7.23 Resolution of distributions into Gaussians
7.24 Hermite polynomial solutions
7.25 Fourier transform relations
7.26 There is hope after all
7.27 Comments
7.27.1 Terminology again

8 Sufficiency, ancillarity, and all that
8.1 Sufficiency
8.2 Fisher sufficiency
8.2.1 Examples
8.2.2 The Blackwell-Rao theorem
8.3 Generalized sufficiency
8.4 Sufficiency plus nuisance parameters
8.5 The likelihood principle
8.6 Ancillarity
8.7 Generalized ancillary information
8.8 Asymptotic likelihood: Fisher information
8.9 Combining evidence from different sources
8.10 Pooling the data
8.10.1 Fine-grained propositions
8.11 Sam's broken thermometer
8.12 Comments
8.12.1 The fallacy of sample re-use
8.12.2 A folk theorem
8.12.3 Effect of prior information
8.12.4 Clever tricks and gamesmanship

9 Repetitive experiments: probability and frequency
9.1 Physical experiments
9.2 The poorly informed robot
9.3 Induction
9.4 Are there general inductive rules?
9.5 Multiplicity factors
9.6 Partition function algorithms
9.6.1 Solution by inspection
9.7 Entropy algorithms
9.8 Another way of looking at it
9.9 Entropy maximization
9.10 Probability and frequency
9.11 Significance tests
9.11.1 Implied alternatives
9.12 Comparison of psi and chi-squared
9.13 The chi-squared test
9.14 Generalization
9.15 Halley's mortality table
9.16 Comments
9.16.1 The irrationalists
9.16.2 Superstitions

10 Physics of 'random experiments'
10.1 An interesting correlation
10.2 Historical background
10.3 How to cheat at coin and die tossing
10.3.1 Experimental evidence
10.4 Bridge hands
10.5 General random experiments
10.6 Induction revisited
10.7 But what about quantum theory?
10.8 Mechanics under the clouds
10.9 More on coins and symmetry
10.10 Independence of tosses
10.11 The arrogance of the uninformed

Part II Advanced applications
11 Discrete prior probabilities: the entropy principle
11.1 A new kind of prior information
11.2 Minimum ∑ p_i^2
11.3 Entropy: Shannon's theorem
11.4 The Wallis derivation
11.5 An example
11.6 Generalization: a more rigorous proof
11.7 Formal properties of maximum entropy distributions
11.8 Conceptual problems - frequency correspondence
11.9 Comments

12 Ignorance priors and transformation groups
12.1 What are we trying to do?
12.2 Ignorance priors
12.3 Continuous distributions
12.4 Transformation groups
12.4.1 Location and scale parameters
12.4.2 A Poisson rate
12.4.3 Unknown probability for success
12.4.4 Bertrand's problem
12.5 Comments

13 Decision theory, historical background
13.1 Inference vs. decision
13.2 Daniel Bernoulli's suggestion
13.3 The rationale of insurance
13.4 Entropy and utility
13.5 The honest weatherman
13.6 Reactions to Daniel Bernoulli and Laplace
13.7 Wald's decision theory
13.8 Parameter estimation for minimum loss
13.9 Reformulation of the problem
13.10 Effect of varying loss functions
13.11 General decision theory
13.12 Comments
13.12.1 'Objectivity' of decision theory
13.12.2 Loss functions in human society
13.12.3 A new look at the Jeffreys prior
13.12.4 Decision theory is not fundamental
13.12.5 Another dimension?

14 Simple applications of decision theory
14.1 Definitions and preliminaries
14.2 Sufficiency and information
14.3 Loss functions and criteria of optimum performance
14.4 A discrete example
14.5 How would our robot do it?
14.6 Historical remarks
14.6.1 The classical matched filter
14.7 The widget problem
14.7.1 Solution for Stage 2
14.7.2 Solution for Stage 3
14.7.3 Solution for Stage 4
14.8 Comments

15 Paradoxes of probability theory
15.1 How do paradoxes survive and grow?
15.2 Summing a series the easy way
15.3 Nonconglomerability
15.4 The tumbling tetrahedra
15.5 Solution for a finite number of tosses
15.6 Finite vs. countable additivity
15.7 The Borel-Kolmogorov paradox
15.8 The marginalization paradox
15.8.1 On to greater disasters
15.9 Discussion
15.9.1 The DSZ Example #5
15.9.2 Summary
15.10 A useful result after all?
15.11 How to mass-produce paradoxes
15.12 Comments

16 Orthodox methods: historical background
16.1 The early problems
16.2 Sociology of orthodox statistics
16.3 Ronald Fisher, Harold Jeffreys, and Jerzy Neyman
16.4 Pre-data and post-data considerations
16.5 The sampling distribution for an estimator
16.6 Pro-causal and anti-causal bias
16.7 What is real, the probability or the phenomenon?
16.8 Comments
16.8.1 Communication difficulties

17 Principles and pathology of orthodox statistics
17.1 Information loss
17.2 Unbiased estimators
17.3 Pathology of an unbiased estimate
17.4 The fundamental inequality of the sampling variance
17.5 Periodicity: the weather in Central Park
17.5.1 The folly of pre-filtering data
17.6 A Bayesian analysis
17.7 The folly of randomization
17.8 Fisher: common sense at Rothamsted
17.8.1 The Bayesian safety device
17.9 Missing data
17.10 Trend and seasonality in time series
17.10.1 Orthodox methods
17.10.2 The Bayesian method
17.10.3 Comparison of Bayesian and orthodox estimates
17.10.4 An improved orthodox estimate
17.10.5 The orthodox criterion of performance
17.11 The general case
17.12 Comments

18 The A_p distribution and rule of succession
18.1 Memory storage for old robots
18.2 Relevance
18.3 A surprising consequence
18.4 Outer and inner robots
18.5 An application
18.6 Laplace's rule of succession
18.7 Jeffreys' objection
18.8 Bass or carp?
18.9 So where does this leave the rule?
18.10 Generalization
18.11 Confirmation and weight of evidence
18.11.1 Is indifference based on knowledge or ignorance?
18.12 Carnap's inductive methods
18.13 Probability and frequency in exchangeable sequences
18.14 Prediction of frequencies
18.15 One-dimensional neutron multiplication
18.15.1 The frequentist solution
18.15.2 The Laplace solution
18.16 The de Finetti theorem
18.17 Comments

19 Physical measurements
19.1 Reduction of equations of condition
19.2 Reformulation as a decision problem
19.2.1 Sermon on Gaussian error distributions
19.3 The underdetermined case: K is singular
19.4 The overdetermined case: K can be made nonsingular
19.5 Numerical evaluation of the result
19.6 Accuracy of the estimates
19.7 Comments
19.7.1 A paradox

20 Model comparison
20.1 Formulation of the problem
20.2 The fair judge and the cruel realist
20.2.1 Parameters known in advance
20.2.2 Parameters unknown
20.3 But where is the idea of simplicity?
20.4 An example: linear response models
20.4.1 Digression: the old sermon still another time
20.5 Comments
20.5.1 Final causes

21 Outliers and robustness
21.1 The experimenter's dilemma
21.2 Robustness
21.3 The two-model model
21.4 Exchangeable selection
21.5 The general Bayesian solution
21.6 Pure outliers
21.7 One receding datum

22 Introduction to communication theory
22.1 Origins of the theory
22.2 The noiseless channel
22.3 The information source
22.4 Does the English language have statistical properties?
22.5 Optimum encoding: letter frequencies known
22.6 Better encoding from knowledge of digram frequencies
22.7 Relation to a stochastic model
22.8 The noisy channel

Appendix A Other approaches to probability theory
A.1 The Kolmogorov system of probability
A.2 The de Finetti system of probability
A.3 Comparative probability
A.4 Holdouts against universal comparability
A.5 Speculations about lattice theories

Appendix B Mathematical formalities and style
B.1 Notation and logical hierarchy
B.2 Our 'cautious approach' policy
B.3 Willy Feller on measure theory
B.4 Kronecker vs. Weierstrasz
B.5 What is a legitimate mathematical function?
B.5.1 Delta-functions
B.5.2 Nondifferentiable functions
B.5.3 Bogus nondifferentiable functions
B.6 Counting infinite sets?
B.7 The Hausdorff sphere paradox and mathematical diseases
B.8 What am I supposed to publish?
B.9 Mathematical courtesy

Appendix C Convolutions and cumulants
C.1 Relation of cumulants and moments
……
Preface: The following material is addressed to readers who are already familiar with applied mathematics, at the advanced undergraduate level or preferably higher, and with some field, such as physics, chemistry, biology, geology, medicine, economics, sociology, engineering, operations research, etc., where inference is needed. A previous acquaintance with probability and statistics is not necessary; indeed, a certain amount of innocence in this area may be desirable, because there will be less to unlearn.
We are concerned with probability theory and all of its conventional mathematics, but now viewed in a wider context than that of the standard textbooks. Every chapter after the first has 'new' (i.e. not previously published) results that we think will be found interesting and useful. Many of our applications lie outside the scope of conventional probability theory as currently taught. But we think that the results will speak for themselves, and that something like the theory expounded here will become the conventional probability theory of the future.
Excerpt


This kind of conceptualizing often leads one to suppose that these distributions represent not just our prior state of knowledge about the data, but the actual long-run variability of the data in such experiments. Clearly, such a belief cannot be justified; anyone who claims to know in advance the long-run results in an experiment that has not been performed is drawing on a vivid imagination, not on any fund of actual knowledge of the phenomenon. Indeed, if that infinite population is only imagined, then it seems that we are free to imagine any population we please.
From a mere act of the imagination we cannot learn anything about the real world. To suppose that the resulting probability assignments have any real physical meaning is just another form of the mind projection fallacy. In practice, this diverts our attention to irrelevancies and away from the things that really matter (such as information about the real world that is not expressible in terms of any sampling distribution, or does not fit into the urn picture, but which is nevertheless highly cogent for the inferences we want to make). Usually, the price paid for this folly is missed opportunities; had we recognized that information, more accurate and/or more reliable inferences could have been made. Urn-type conceptualizing is capable of dealing with only the most primitive kind of information, and really sophisticated applications require us to develop principles that go far beyond the idea of urns. But the situation is quite subtle, because, as we stressed before in connection with Gödel's theorem, an erroneous argument does not necessarily lead to a wrong conclusion. In fact, as we shall find in Chapter 9, highly sophisticated calculations sometimes lead us back to urn-type distributions, for purely mathematical reasons that have nothing to do conceptually with urns or populations. The hypergeometric and binomial distributions found
……
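For reference (this note is not part of the original excerpt), the two urn-type sampling distributions the passage alludes to take the standard forms below; the notation here is the conventional one (N balls in the urn, M of them red, n drawn, r red balls observed) and may differ from the symbols used in Chapter 3 of the book. Sampling without replacement gives the hypergeometric distribution,

\[ p(r \mid N, M, n) = \frac{\binom{M}{r}\binom{N-M}{n-r}}{\binom{N}{n}}, \]

while sampling with replacement (or letting N grow large with the fraction f = M/N held fixed) gives the binomial distribution,

\[ p(r \mid n, f) = \binom{n}{r} f^{r} (1-f)^{n-r}. \]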
