Informatics
In an age dominated by digital technologies and rapid information exchange, informatics stands as a cornerstone discipline, connecting computing, data, and human interaction.
What is informatics?
The word 'informatics' is a blend of 'information' and 'automatic', and it first appeared in the late 1950s. It's the science behind data manipulation, automation, and fast, efficient and complex computing, born as an extension of the human brain's natural abilities. Informatics emerged as a new field when computing machines became complex and programmable: whereas the first computing machines could each perform just one task and nothing else, modern ones offer features like computational speed, concurrency and multitasking that allow them to be adapted to the user's needs.
A step back in the history of informatics

The origins of informatics can be traced back to the 19th century, with the development of early computing machines such as Charles Babbage's Analytical Engine (1837) and with Ada Lovelace's pioneering work on programming concepts.
In the 20th century, Claude Shannon's groundbreaking work on information theory laid the foundation for the mathematical treatment of information and communication. His seminal paper "A Mathematical Theory of Communication" (1948) is considered a cornerstone in the field.
The mid-20th century witnessed the emergence of computer science as a distinct discipline. Alan Turing's conceptualization of the Turing machine and the development of the first electronic computers, such as the ENIAC (Electronic Numerical Integrator and Computer) in the 1940s, marked a transformative era.
The 1960s saw the development of the first database management systems (DBMS), such as IBM's IMS (Information Management System). These systems laid the groundwork for organized and efficient storage and retrieval of information.
The late 20th century brought about the advent of the internet, revolutionizing global communication and information exchange. Tim Berners-Lee's creation of the World Wide Web in 1989 further accelerated the accessibility of information.
In the 21st century, the explosion of digital data led to the emergence of data science as a crucial aspect of informatics. The ability to analyze vast datasets became instrumental in deriving meaningful insights and driving decision-making processes, leading to the birth and rise of artificial intelligence. Now, it seems that everything is moving in that direction, fulfilling, in the end, the real meaning behind the word 'informatics': a branch of technology oriented towards automating every human action, from mechanical tasks to those involving brain activity.
Fields of application
Informatics has many fields of application, and this is precisely why it is so important. You can develop specific programming skills that will be useful for some of them but useless for others, since specialisation is now mandatory. Just to give you an idea, below I describe some branches, very different from each other, where informatics and programming have become crucial.
In scientific research, informatics is instrumental for data analysis, simulations, and modelling. It accelerates the pace of discovery and innovation across various scientific disciplines. Informatics is the basis of every tool or piece of software developed for scientific purposes around the world: MATLAB, Simulink, AWR, the Cadence and Synopsys EDA and CAD environments, CATIA, and many others.
In healthcare, informatics is used to manage patient records, streamline communication among healthcare professionals, and analyze medical data for research and diagnosis. Electronic health records (EHRs) have become a standard tool for healthcare providers.
In business and finance, informatics is essential for managing operations, analyzing market trends, and making strategic decisions. Financial institutions rely on informatics to handle transactions, manage risk, and detect fraudulent activities.
The development of communication technologies, including the internet and mobile networks, is rooted in informatics. It enables seamless global connectivity and information exchange.
How to begin?
I began learning the basics of programming when I was still a child, self-teaching every step of the way. My family has never been particularly technologically inclined, so for many years I didn't have a peer to compare notes with. The first thing I learned was HTML, but I quickly realized that markup and web design weren't that interesting to me, so I moved on to Python2.x and C. I wrote my first programs and studied system and socket programming, since I was drawn to how a computing machine operates and communicates with other machines over a network. I went on tinkering for many years, and I can say that I'm still learning in the same way, without any specific method, treating programming simply as a supporting skill set: I use scripting and programming languages to perform post-processing analysis on experimental data, automate long and tedious operations, and run simulations with minimal manual effort.
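To make that last point concrete, here is a minimal sketch of the kind of post-processing script I mean, written in Python3 using only the standard library. The file name and column layout are purely hypothetical, just for illustration:

    import csv
    import statistics

    # Load a two-column CSV of experimental samples (time, value).
    # 'measurements.csv' is a placeholder name, not a real dataset.
    times, values = [], []
    with open("measurements.csv", newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            times.append(float(row[0]))
            values.append(float(row[1]))

    # Post-processing: summary statistics instead of manual inspection.
    print(f"samples: {len(values)}")
    print(f"mean:    {statistics.mean(values):.4f}")
    print(f"stdev:   {statistics.stdev(values):.4f}")
    peak = max(values)
    print(f"peak:    {peak:.4f} at t = {times[values.index(peak)]}")

A few lines like these replace minutes of repetitive spreadsheet work every time a new measurement file arrives, which is exactly the kind of automation I am talking about.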
However, if you want to begin, I would suggest you start exactly as I did: Python3 and C. The first is simple and straightforward, and it helps you focus on typical programming aspects like loops, recursion and data manipulation (see the example below). Since Python eases the burden of low-level programming, leaving all the machine-related operations to an interpreter, I warmly suggest you compensate with C: this should fill the gaps that high-level programmers typically have, strengthening your knowledge of memory management, code execution and computer architecture.
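If you are wondering what those typical programming aspects look like in practice, here is a small, self-contained Python3 example touching a loop, a recursive function and some basic data manipulation; the numbers are arbitrary:

    # A loop: sum the squares of the first ten integers.
    total = 0
    for n in range(1, 11):
        total += n * n
    print("sum of squares:", total)  # 385

    # Recursion: the classic factorial definition.
    def factorial(n):
        return 1 if n <= 1 else n * factorial(n - 1)
    print("10! =", factorial(10))  # 3628800

    # Data manipulation: filter and transform a list in one expression.
    samples = [3, -1, 4, -1, 5, -9, 2, 6]
    positives_doubled = [2 * x for x in samples if x > 0]
    print("positives doubled:", positives_doubled)  # [6, 8, 10, 4, 12]

Rewriting the same three exercises in C, with explicit arrays and manual memory management, is a good way to experience the low-level details that Python hides from you.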