The general theory of stochastic processes and the more specialized theory of Markov processes evolved enormously in the second half of the last century. In parallel, the theory of controlled Markov chains (or Markov decision processes) was being pioneered by control engineers and operations researchers. Researchers in Markov processes and controlled Markov chains have long been aware of the synergies between these two subject areas. However, this may be the first volume dedicated to highlighting these synergies and, almost certainly, it is the first volume that emphasizes the contributions of the vibrant and growing Chinese school of probability. The chapters that appear in this book reflect both the maturity and the vitality of modern-day Markov processes and controlled Markov chains. They will also provide an opportunity to trace the connections that have emerged between the work done by members of the Chinese school of probability and the work done by European, US, Central and South American, and Asian scholars.
Preface.
Part I: Markov processes.
1. Branching exit Markov systems and their applications to partial differential equations; E.B. Dynkin.
2. Feller transition functions, resolvent decomposition theorems, and their application in unstable denumerable Markov processes; A. Chen, et al.
3. Identifying Q-processes with a given finite mu-invariant measure; P.K. Pollett.
4. Convergence property of standard transition functions; H. Zhang, et al.
5. Markov skeleton processes; H. Zhenting, et al.
6. Piecewise deterministic Markov processes and semi-dynamic systems; G. Liu.
Part II: Controlled Markov chains and decision processes.
7. Average optimality for adaptive Markov control processes with unbounded costs and unknown disturbance distribution; J.A. Minjárez-Sosa.
8. Controlled Markov chains with utility functions; S. Iwamoto, et al.
9. Classification problems in MDPs; L.C.M. Kallenberg.
10. Optimality conditions for CTMDP with average cost criterion; X. Guo, W. Zhu.
11. Optimal and nearly optimal policies in Markov decision chains with nonnegative rewards and risk-sensitive expected total-reward criterion; R. Cavazos-Cadena, R. Montes-de-Oca.
12. Interval methods for uncertain Markov decision processes; M. Kurano, et al.
13. Constrained discounted semi-Markov decision processes; E.A. Feinberg.
14. Linear program for communicating MDPs with multiple constraints; J.A. Filar, X. Guo.
15. Optimal switching problem for Markov chains; A.A. Yushkevich.
16. Approximations of a controlled diffusion model for renewable resource exploitation; S. Pasquali, W.J. Runggaldier.
Part III: Stochastic processes and martingales.
17. A Fleming-Viot process with unbounded selection, II; S.N. Ethier, T. Shiga.
18. Boundary theory for superdiffusions; S.E. Kuznetsov.
19. On solutions of backward stochastic differential equations with jumps and stochastic control; S. Rong.
20. Doob's inequality and lower estimation of the maximum of martingales; L. Zhichan.
21. The Hausdorff measure of the level sets of Brownian motion on the Sierpinski carpet; Y. Chenggui, C. Xuerong.
22. Monotonic approximation of the Gittins index; X. Wang.
Part IV: Applications to finance, control systems and other related fields.
23. Optimal consumption-investment decisions allowing for bankruptcy: A brief survey; S.P. Sethi.
24. The hedging strategy of an Asian option; Z. Yang, J. Zou.
25. The pricing of options to exchange one asset for another; C. Chen, et al.
26. Finite horizon portfolio risk models with probability criterion; Y. Lin, et al.
27. Long term average control of a local time process; M.S. Mendiondo, R.H. Stockbridge.
28. Singularly perturbed hybrid control systems approximated by structured linear programs; A. Haurie, et al.
29. The effect of stochastic disturbance on the solitary waves; J. Li, et al.
30. Independent candidate for Tierney model of H-M algorithms; P. Chen.
31. How rates of convergence for Gibbs fields depend on the interaction and the kinds of scanning used;
EUR 17.52 shipping from the United Kingdom to Italy
FREE shipping from the U.S.A. to Italy
From: Romtrade Corp., STERLING HEIGHTS, MI, U.S.A.
Condition: New. This is a brand-new US edition. This item may be shipped from the US or any other country, as we have multiple locations worldwide. Seller inventory ABNR-88052
Quantity: 1 available
From: Basi6 International, Irving, TX, U.S.A.
Condition: Brand New. New. US edition. Expedited shipping for all USA and Europe orders excluding PO Box. Excellent customer service. Seller inventory ABEJUNE24-165945
Quantity: 1 available
From: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. Seller inventory ria9781402008030_new
Quantity: More than 20 available
From: moluna, Greven, Germany
Hardcover. Condition: New. Seller inventory 458472590
Quantity: More than 20 available
From: GreatBookPricesUK, Woodford Green, United Kingdom
Condition: New. Seller inventory 947460-n
Quantity: More than 20 available
From: GreatBookPrices, Columbia, MD, U.S.A.
Condition: New. Seller inventory 947460-n
Quantity: More than 20 available
From: THE SAINT BOOKSTORE, Southport, United Kingdom
Hardback. Condition: New. This item is printed on demand. New copy; usually dispatched within 5-9 working days. Seller inventory C9781402008030
Quantity: More than 20 available
From: Books Puddle, New York, NY, U.S.A.
Condition: New. pp. 524. Seller inventory 263101156
Quantity: 4 available
From: Lucky's Textbooks, Dallas, TX, U.S.A.
Condition: New. Seller inventory ABLIING23Mar2411530141308
Quantity: More than 20 available
From: Majestic Books, Hounslow, United Kingdom
Condition: New. Print on demand. pp. 524, illus. Seller inventory 5828155
Quantity: 4 available