- Journal Data
- Format: Journal
- eISSN: 2199-6059
- ISSN: 0860-150X
- First Published: 08 Aug 2013
- Publication Frequency: 4 issues per year
- Languages: English
- Open Access
What Logic and Grammar Bring to the Issue of Solvability? Meditation in this Journal’s 40th Anniversary
Page range: 7 - 15
Abstract
- Open Access
Introduction to the Issue: Contemporary Philosophy of Informatics
Page range: 17 - 18
Abstract
The core entities of computer science include formal languages, specifications, models, programs, implementations, semantic theories, type inference systems, abstract and physical machines. While there are conceptual questions concerning their nature, and in particular ontological ones (Turner 2018), our main focus here will be on the relationships between them. These relationships have an
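As a small, purely illustrative sketch of the relationship between a specification and a program (the property, the function names, and the testing-based check below are assumptions, not drawn from the article), a specification can be written as a checkable predicate, paired with a candidate implementation and a weak form of verification by sampling inputs:

```python
import random

# Illustrative sketch: a specification as a checkable property, an
# implementation, and a (weak) verification step by random testing.
# Names and the testing approach are illustrative, not from the article.

def spec_sorted_permutation(xs, ys):
    """Specification: ys is the sorted rearrangement of xs."""
    return ys == sorted(xs)

def implementation(xs):
    """A candidate program intended to meet the specification."""
    return sorted(xs)

# "Verification" here is only testing: evidence of correctness on samples,
# not a proof that the program meets its specification on all inputs.
for _ in range(1000):
    xs = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
    assert spec_sorted_permutation(xs, implementation(xs))
print("specification held on all sampled inputs")
```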
Keywords
- intention
- specification
- correctness
- verification
- Open Access
Is Church’s Thesis Still Relevant?
Page range: 31 - 51
Abstract
The article analyses the role of Church’s Thesis (hereinafter CT) in the context of the development of hypercomputation research. The text begins by presenting various views on the essence of computer science and the limitations of its methods. Then CT and its importance in determining the limits of the methods used by computer science are presented. Based on the above explanations, the work goes on to characterize various proposals of hypercomputation, showing their relative power in relation to the arithmetic hierarchy.
The general theme of the article is the analysis of the mutual relations between the content of CT and the theories of hypercomputation. In the main part of the paper, the arguments for abolishing CT prompted by the introduction of hypercomputational methods in computer science are presented, together with a critique of these views. The role of the efficiency condition contained in the formulation of CT is stressed. The discussion ends with a summary defending the current status of Church’s Thesis within the framework of philosophy and computer science as an important point of reference for determining what the notion of effective calculability really is. The considerations included in this article seem to be quite up-to-date relative to the current state of affairs in computer science.
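As a reminder of the classical limit that CT marks out, the sketch below gives a minimal rendering of the diagonal argument against a universal halting decider; the function names and the toy stand-in decider are illustrative assumptions, not material from the article.

```python
# A minimal sketch of the diagonal argument behind the limits Church's
# Thesis is concerned with: no total program `halts(p, x)` can correctly
# decide halting for every program.

def make_diagonal(halts):
    """From a purported halting decider, build a program it must misjudge."""
    def diagonal(source):
        if halts(source, source):
            while True:          # loop forever exactly when the decider says "halts"
                pass
        return "halted"
    return diagonal

def naive_halts(program, program_input):
    # toy stand-in decider that claims everything halts; any candidate
    # decider, however clever, is defeated in the same way
    return True

diagonal = make_diagonal(naive_halts)
# Running `diagonal` on (a description of) itself would loop forever
# precisely because naive_halts claimed it halts, so naive_halts is wrong
# about at least one input.
```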
Keywords
- hypercomputation
- Church’s thesis
- effective computations
- computer science & philosophy
- Open Access
Implicit and Explicit Examples of the Phenomenon of Deviant Encodings
Page range: 53 - 67
Abstract
The core of the problem discussed in this paper is the following: the Church-Turing Thesis states that Turing Machines formally explicate the intuitive concept of computability. The description of Turing Machines requires a description of the notation used for the
Deviant encodings appear
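For readers unfamiliar with the phenomenon, the toy sketch below (not taken from the paper) illustrates why computability claims are relative to notation: under standard binary numerals the values of a function must be computed, while a contrived "deviant" notation can smuggle non-effective information into the numerals themselves. The finite `toy_oracle` stands in for answers to an undecidable question and is purely an illustrative assumption.

```python
# Toy illustration of a deviant encoding: numerals that smuggle in answers
# to an (in reality undecidable) question. `toy_oracle` is a finite stand-in
# used only so the example runs; producing such numerals for every n would
# itself require a non-effective procedure.

def encode_standard(n: int) -> str:
    return bin(n)[2:]                       # ordinary binary numeral for n

toy_oracle = {0: 1, 1: 0, 2: 1, 3: 1}       # pretend answers for n = 0..3

def encode_deviant(n: int) -> str:
    # a numeral for n that also carries the oracle bit for n
    return bin(n)[2:] + "#" + str(toy_oracle[n])

def decide_from_deviant(numeral: str) -> int:
    # "decides" the smuggled question by merely reading the notation
    return int(numeral.split("#")[1])

print(encode_standard(2), encode_deviant(2), decide_from_deviant(encode_deviant(2)))
```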
Keywords
- concept of computation
- the concept of natural number
- deviant encoding
- computational structuralism
- conceptual engineering
- explications
- the Church-Turing thesis
- Open Access
Analogicity in Computer Science. Methodological Analysis
Page range: 69 - 86
Abstract
Analogicity in computer science is understood in two, not mutually exclusive, ways: 1) with regard to the continuity feature (of data or computations), 2) with regard to the analogousness feature (i.e. similarity between certain natural processes and computations). Continuous computations are the subject of three methodological questions considered in the paper: 1a) to what extent do their theoretical models go beyond the model of the universal Turing machine (which defines digital computations), 1b) is their computational power greater than that of the universal Turing machine, 1c) under what conditions are continuous computations realizable in practice? The analogue-analogical computations lead to two other issues: 2a) in what sense and to what extent does their accuracy depend on the adequacy of certain theories of the empirical sciences, 2b) are there analogue-analogical computations in nature that are also continuous? The above issues are an important element of the philosophical discussion on the limitations of contemporary computer science.
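To make the first theme concrete, here is a minimal sketch, under the standard assumption that continuous-time (analog) computation can be modelled by ordinary differential equations, of how a digital machine only approximates such a computation step by step; the ODE y' = y, the step size, and the function name are illustrative choices.

```python
# Digital (Euler-step) approximation of a continuous "analog" computation:
# the ODE y' = y with y(0) = 1, whose exact solution is exp(t).

def euler_exp(t_end: float, h: float = 1e-4) -> float:
    """Approximate y(t_end) = exp(t_end) by discrete Euler steps."""
    y, t = 1.0, 0.0
    while t < t_end:
        y += h * y      # Euler step for y' = y
        t += h
    return y

print(euler_exp(1.0))   # close to e ~ 2.71828; accuracy depends on the step size h
```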
Keywords
- computer science
- methodology of computer science
- analogicity
- analogue computations
- analogousness
- natural computations
- hypercomputations
- Open Access
Algorithmicity of Evolutionary Algorithms
Page range: 87 - 100
Abstract
In the first part of our article we will discuss the penetration of scientific terms into colloquial language, focusing on the sense in which the concept of an algorithm currently functions outside its original scope. The examples given will refer mostly to disciplines not directly related to computer science and to colloquial language. In the next part we will discuss the modifications made to the meaning of the term algorithm and how this concept is now understood in computer science. Finally, we will discuss the problem of the algorithmicity of evolutionary algorithms, i.e. we will try to answer the question whether – and to what extent – they are still algorithms in the classical sense.
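For concreteness, the toy sketch below (not from the article) shows what a minimal evolutionary algorithm looks like; its reliance on random mutation and the fact that its output is not uniquely determined by its input are precisely the features that put pressure on the classical notion of an algorithm. Population size, mutation rate, and number of generations are arbitrary illustrative choices.

```python
import random

# Minimal evolutionary algorithm (selection + mutation) maximizing a toy
# fitness function: the number of 1s in a bit string.

def fitness(bits):
    return sum(bits)

def mutate(bits, rate=0.05):
    # flip each bit independently with probability `rate`
    return [b ^ (random.random() < rate) for b in bits]

def evolve(length=20, pop_size=30, generations=100):
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 2]                  # selection
        population = parents + [mutate(p) for p in parents]   # variation
    return max(population, key=fitness)

print(fitness(evolve()))  # typically near 20, but every run is stochastic
```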
Keywords
- algorithm
- algorithmicity
- evolutionary algorithms
- scientific term
- philosophy of informatics
- Open Access
The Intuitive Concept of Information: An Analysis
Page range: 101 - 119
Abstract
This paper seeks to determine the intuitive meaning of the concept of information by indicating its essential (definitional) features and its relations with other concepts, such as that of knowledge. The term “information” – as with many other concepts, such as “process”, “force”, “energy” and “matter” – has a certain established meaning in natural languages, which allows it to be used, in science as well as in everyday life, without our possessing any stricter definition of it. The basic aim here is thus to explicate what it amounts to in the context of its intuitive meaning as encountered in natural languages, what the subject of cognition implicitly presumes when using the term, and to which ontological situations it can be applied. I demonstrate that the essential features of the notion of information include the presence of a material medium, its transformation, the recording and reading of information encoded in the medium, and the grasp of what is recorded, coded and transmitted as an intentional object, where the latter is construed in terms broadly in line with the ontologies of Husserl and Ingarden. Along the way, a number of issues relating to the notion of information are also pointed out: the problem of informational identity, of the existence of virtual objects, and of the choice of an adequate information carrier, as well as formal-ontological problems, including those which concern relations between information carriers and intentional objects.
Keywords
- information
- definition of the concept of information
- knowledge
- intentional object
- reconstruction of the hermeneutical horizon
- intuitive analysis of concepts
- phenomenology
- Open Access
Heuristic Potential of Antoni Kępiński’s Information Metabolism Model and Data Smog
Page range: 121 - 139
Abstract
In this paper I present the explanatory potential of Antoni Kępiński’s model of information metabolism for the question of data smog (data glut, information overload). Kępiński’s model is not well known and the bibliography concerning information metabolism is still rather sparse. In the article I present the model as a good heuristic for explaining information exchange between a system and its environment. The particular aim of my deliberations is to use the model of information metabolism to discuss the pathological, although common, phenomenon of data smog. This allows for a holistic view of the problem and enables new claims about data smog, its ethical consequences, and ways of counteracting the negative effects of information overload.
Keywords
- Antoni Kępiński
- information metabolism
- information
- data smog
- data glut
- information overload
- Open Access
Value-Sensitive Co-Design for Resilient Information Systems
Page range: 141 - 164
Abstract
In Information Systems development, resilience has often been treated as a non-functional requirement, and little or no work is aimed at building resilience in end-users through systems development. The question of how values and resilience (for the end-user) can be incorporated into the design of systems is an ongoing research activity in user-centered design. In this paper we evaluate the relation between values and resilience within the context of an ongoing software development project and contribute a formal model of co-design based on a significant extension of Abstract Design Theory. The formal analysis provides a full and clear-cut definition of the co-design space, its objectives and its processes. On the basis of both, we provide an abstract definition of a resilient system (for the end-user). We conclude that value-sensitive co-design enforces better resilience in end-users.
Keywords
- Abstract Design Theory
- Value Sensitive Design
- Co-Design
- Resilience
- Open Access
Phronetic Ethics in Social Robotics: A New Approach to Building Ethical Robots
Page range: 165 - 183
Abstract
Social robots are autonomous robots, or Artificial Moral Agents (AMAs), that are to interact with, respect, and embody human ethical values. However, the conceptual and practical problems of building such systems have not yet been resolved and pose a significant challenge for computational modeling. It seems that the lack of success in constructing robots,
Keywords
- phronetic ethics
- ethical robots
- Aristotelian ethics
- artificial moral agents
- socially integrated autonomous machines
- roboethics
- machine ethics
- autonomous ethical systems
- phronesis
- ethical ascend
- Open Access
Data-Driven Dialogue Models: Applying Formal and Computational Tools to the Study of Financial and Moral Dialogues
Page range: 185 - 208
Abstract
This paper proposes two formal models for understanding real-life dialogues, aimed at capturing argumentative structures performatively enacted during conversations. In the course of the investigation, two types of discourse with a high degree of well-structured argumentation were chosen: moral debate and financial communication. The research project was confronted by the need to analyse, structure and formally describe large volumes of textual data, which called for the application of computational tools. It is expected that the results of the proposed research will make a contribution to formal systems modelling and the evaluation of communication from the point of view of argument soundness.
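As a purely illustrative sketch (not the authors' formal models), argumentative structure extracted from a recorded dialogue might be represented computationally as a small graph of utterances linked by support and attack relations; the class names and relation labels below are assumptions.

```python
from dataclasses import dataclass, field

# Toy representation of argumentative structure in a dialogue:
# utterances as nodes, support/attack relations as labelled edges.

@dataclass
class Utterance:
    speaker: str
    text: str

@dataclass
class ArgumentGraph:
    utterances: list = field(default_factory=list)
    relations: list = field(default_factory=list)   # (src, dst, "support" | "attack")

    def add(self, utt):
        self.utterances.append(utt)
        return len(self.utterances) - 1              # index used as node id

    def relate(self, src, dst, kind):
        self.relations.append((src, dst, kind))

g = ArgumentGraph()
a = g.add(Utterance("A", "We should cap executive bonuses."))
b = g.add(Utterance("B", "Caps drive talent abroad."))
g.relate(b, a, "attack")
print(g.relations)
```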
Keywords
- data-driven dialogue systems
- corpus studies
- computational tools in argument analysis
- Open Access
The Philosophy of Expertise in the Age of Medical Informatics: How Healthcare Technology is Transforming Our Understanding of Expertise and Expert Knowledge?
Page range: 209 - 225
Abstract
The unprecedented development of medical informatics is constantly transforming the concept of expertise in the medical sciences in a way that has far-reaching consequences for both the theory of knowledge and the philosophy of informatics. Deep medicine is based on the assumption that medical diagnosis should take into account the wide array of possible health factors involved in the diagnostic process: not only genome analysis, but also the metabolome (analysis of all body metabolites important, e.g., for drug-drug interactions), the microbiome (analysis of all bodily microorganisms interacting with host cells) or the exposome (analysis of all environmental factors triggering potentially harmful cell mutations, such as UV radiation, heavy metals, various carcinogens, etc.). Deep data analysis is of paramount importance for personalized diagnosis but, at the same time, hardly achievable by a regular human being. However, adequately designed artificial intelligence (e.g. a deep neural network) can undeniably be of great help in finding correlations between symptoms and underlying diseases. Nowadays AI applies to nearly every aspect of medicine, from the data analysis of scientific literature, through the diagnostic process, to the act of communication between physicians and their patients. Medical image processing algorithms greatly enhance the chances of successful recognition of melanoma or intestinal polyps. Communication tools designed for physicians use techniques known from social media to give users an opportunity to consult cases with colleagues from the same discipline. Natural language processing tools significantly improve doctor-patient communication by automating note-taking. Is this enough to support the claim that non-epistemic competences in medicine are becoming more and more important? Can we attribute expertise only to a person? How is medical informatics changing the way most people usually understand human-computer interactions?
Keywords
- philosophy of expertise
- expert knowledge
- philosophy of medical informatics
- computers in medicine