
Giovanni Sommaruga

    Aspects et problèmes du conventionnalisme
    Formal theories of information
    Turing’s Revolution
    History and Philosophy of Constructive Type Theory
    • The book offers an in-depth exploration of Martin-Löf's constructive type theory, highlighting its evolution from 1970 to 1995 through eight distinct stages. Sommaruga meticulously surveys the prehistory and complex development of type theory, presenting a comprehensive account of its latest version as introduced by Martin-Löf in 1993. Unlike previous brief presentations, this work addresses critical issues related to type theory's implications for logic and the foundations of mathematics, dedicating a significant section to these topics.

      History and Philosophy of Constructive Type Theory
    • Turing’s Revolution

      The Impact of His Ideas about Computability

      • 329 pages
      • 12 hours of reading

      This book provides an overview of the confluence of ideas in Turing’s era and work and examines the impact of his work on mathematical logic and theoretical computer science. It combines contributions by well-known scientists on the history and philosophy of computability theory, as well as on generalised Turing computability. By looking at the roots and at the philosophical and technical influence of Turing’s work, it is possible to gather new perspectives and new research topics which may be seen as continuations of Turing’s ideas well into the 21st century. The chapter “The Stored-Program Universal Computer: Did Zuse Anticipate Turing and von Neumann?” is available open access under a Creative Commons Attribution 4.0 International License via link.springer.com.

      Turing’s Revolution
    • It is commonly assumed that computers process information, but what exactly is information? Shannon’s information theory provides an initial answer by focusing on measuring the information content of a message, defined as the reduction of uncertainty gained from receiving that message. This uncertainty, rooted in ignorance, is quantified by entropy. The theory has significantly influenced information storage, data compression, transmission, and coding, remaining a vibrant area of research. While Shannon's theory has garnered philosophical interest, it is often critiqued for its "syntactic" focus, overlooking "semantic" aspects. Various philosophical attempts have sought to infuse semantic dimensions into information theory, frequently linking back to Shannon’s work. Additionally, semantic information theory often employs formal logic, connecting information to reasoning, deduction, inference, and decision-making. Entropy and related measures have also proven vital in statistical inference, as statistical data and observations convey information about unknown parameters. Consequently, a distinct branch of statistics has emerged, rooted in concepts from Shannon’s theory, including measurements like Fisher’s information.

      Formal theories of information
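
The entropy measure described in the passage above, Shannon’s quantification of uncertainty as H = −Σ p·log₂(p), can be sketched in a few lines (an illustrative sketch only; the function name is my own, not drawn from the book):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the distribution.

    Terms with p == 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
fair = shannon_entropy([0.5, 0.5])      # 1.0
# A biased coin is more predictable, so each toss conveys less information.
biased = shannon_entropy([0.9, 0.1])    # ≈ 0.469
```

Receiving the outcome of the fair toss reduces uncertainty by a full bit, while the biased toss reduces it by less than half a bit, which is the sense in which a message’s information content is the uncertainty it removes.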