What are the top 10 Wonders of computing and software engineering?
Computer science and software engineering are fascinating fields that continue to evolve and surprise us. They are essential to the modern world and have led to many innovative products and services that have made our lives easier and more efficient. In this blog post, we’ll explore the top 10 wonders of computer science and software engineering.
The things Alan Kay found to be astonishing and amazing (and shocking) are:
- Turing’s notion of machines that can simulate machines completely by interpreting their descriptions, exhibiting the programmable computer as “a language machine” and a “meta-language machine”. Along with this is the simplicity of what is required to do so (a great book is Marvin Minsky’s “Computation: Finite and Infinite Machines”). Turing’s approach is much more of a “real CS” approach compared to Goedel’s earlier methods, and soon led to a large number of important next steps.
- How simple it is (a) to design a whole computer from just one kind of logical element (e.g. “NOT-BOTH”, i.e. NAND), especially when compared (b) to how Russell and Whitehead struggled to “bootstrap mathematics, etc., from logic” at the turn of the last century. (This is one of those “Point of View is Worth 80 IQ Points” cases…)
- Lisp, and McCarthy’s general approach to “mathematical theories of computation” and having languages that can act as their own metalanguage. One of the great cornucopias of our field.
- Sketchpad by Ivan Sutherland for so many reasons, including: the approach to interactive computer graphics and the simulations of the graphic relationships, the “object-oriented” approach to definition and deriving new kinds of things (including “masters” and making instances from masters), enormous virtual worlds that are windowed on the display, the use of goal-directed programming with the system solving the simultaneous goals in real-time, etc. And more, including the demonstration that a simulated computer on a computer need look nothing like the underlying hardware or any “normal” idea of “computer”.
- The big Shannon et al. ideas about how to have imperfect things be organized in systems that are much more perfectly behaved even if the organizational mechanisms are themselves noisy. Includes all forms of “noise”, “representations”, “communications”, “machines”, etc. and poking deeply into Biology and how living things work. Nice implications for “stochastic computing” of many kinds which are needed more and more as things scale.
- The deep implications of “symbolic computation” (now a very un-funded area) for being able to move from the trivialities of “data” (no matter how voluminous) to the profundities and powers of “Meaning”. This used to be called “AI” and now has to be called “real AI” or “strong AI” (it would be much better under a less loaded term: how about “Flexible Competence”?)
- The Internet. Certainly the best thing done by my research community, and the first real essay into the kinds of scaling and stabilities that all computer science should be trying to understand and improve. This was a great invention and development process in all ways, and — by looking at Biology, which inspired it but which we really couldn’t use directly — it had a reasonable chance to work. That it was able to scale stably over more than 10 (maybe 11) orders of magnitude, as indeed planned, is still kind of amazing to me (even though it should have been expected). Judging from most software systems today not being organized like the Internet, one is forced into the opinion that most computerists don’t understand it or why it is great (and maybe don’t even think of it as the fruits of “real computer science”, because it just works so much better and more reliably than most other attempted artifacts in the field).
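Kay’s point about building a whole computer from “NOT-BOTH” can be made concrete in a few lines. The sketch below (plain Python, function names my own) derives NOT, AND, OR, XOR, and a half adder from a single NAND primitive:

```python
# Every Boolean function can be built from NAND ("NOT-BOTH") alone.
def nand(a, b):
    return not (a and b)

def inv(a):            # NOT from a single NAND
    return nand(a, a)

def and_(a, b):        # AND = NOT(NAND)
    return inv(nand(a, b))

def or_(a, b):         # OR via De Morgan's law
    return nand(inv(a), inv(b))

def xor(a, b):         # XOR from four NANDs
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def half_adder(a, b):  # sum and carry: the first step toward an ALU
    return xor(a, b), and_(a, b)
```

From here it is gates all the way up: full adders, registers, and eventually an entire machine, which is the point of the comparison with Russell and Whitehead.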
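McCarthy’s idea of a language acting as its own metalanguage is easiest to see in a toy evaluator. The following is a hedged sketch, not real Lisp: programs are ordinary Python lists, so code and data share one representation, and the evaluator can treat either one as the other:

```python
# A toy Lisp-flavored evaluator: programs are ordinary lists,
# so code and data share one representation (McCarthy's key move).
def evaluate(expr, env):
    if isinstance(expr, str):          # variable reference
        return env[expr]
    if not isinstance(expr, list):     # self-evaluating atom (e.g. a number)
        return expr
    op, *args = expr
    if op == 'quote':                  # return code as data, unevaluated
        return args[0]
    if op == 'if':
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == 'lambda':                 # (lambda (params) body)
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)             # otherwise: function application
    return fn(*[evaluate(a, env) for a in args])

env = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}
# ((lambda (x) (* x x)) (+ 2 3))  evaluates to 25
result = evaluate([['lambda', ['x'], ['*', 'x', 'x']], ['+', 2, 3]], env)
```

Because `quote` hands program text back as plain data, a program in this style can inspect, transform, and re-evaluate other programs, which is the “cornucopia” Kay refers to.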
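The Shannon-style point about organizing imperfect parts into more perfectly behaved systems can be illustrated with the simplest possible redundancy scheme: a triple-repetition code with majority voting over a noisy channel (a toy model of my own, not anything from Kay’s text):

```python
import random

# Reliability from unreliable parts: send each bit three times through
# a noisy channel, then recover it by majority vote.
def noisy(bit, p_flip, rng):
    return bit ^ (rng.random() < p_flip)   # flip the bit with probability p_flip

def send_with_redundancy(bits, p_flip, rng):
    received = []
    for b in bits:
        copies = [noisy(b, p_flip, rng) for _ in range(3)]
        received.append(1 if sum(copies) >= 2 else 0)   # majority vote
    return received

rng = random.Random(0)
message = [1, 0, 1, 1, 0, 0, 1, 0] * 100
decoded = send_with_redundancy(message, 0.05, rng)
errors = sum(a != b for a, b in zip(message, decoded))
# A vote fails only if 2+ of 3 copies flip, so a 5% flip rate drops to
# roughly 3*p^2, under 1% per bit, at the cost of 3x the transmissions.
```

More redundancy drives the error rate down as fast as you like, which is the qualitative result behind error-correcting codes and, arguably, much of how biology copes with noisy components.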
- Application: #1: Self-Driving Cars
Self-driving cars are one of the most hyped technologies of the past few years. And for good reason! These autonomous vehicles have the potential to drastically reduce accidents and improve traffic flow. While there are still some kinks to be ironed out, it’s only a matter of time until self-driving cars become the norm.
- Application: #2: Artificial Intelligence
Artificial intelligence is another technology that is rapidly evolving. AI is being used in a variety of ways, from personal assistants like Siri to chatbots that can carry on a conversation. As AI gets more sophisticated, its capabilities will only continue to grow.
- Application: #3: Virtual Reality
Virtual reality is another exciting technology with a lot of potential. VR has already been used in a number of different industries, from gaming to medicine. And as VR technology gets more advanced, we can only imagine the new and innovative ways it will be used in the future.
- Application: #4: Blockchain
You’ve probably heard of Bitcoin, the digital currency that uses blockchain technology. But what exactly is blockchain? In short, it’s a decentralized database that can be used to store data securely. Blockchain is already being used in a number of different industries, and its applications are only growing.
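The hash-chaining at the heart of that “decentralized database” fits in a few lines. Here is a minimal sketch in Python (toy structure with field names of my own; real blockchains add signatures, consensus, and proof-of-work on top):

```python
import hashlib
import json

# Each block commits to the previous block's hash,
# so tampering with history is detectable.
def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    prev = block_hash(chain[-1]) if chain else '0' * 64
    chain.append({'data': data, 'prev_hash': prev})
    return chain

def verify(chain):
    return all(chain[i]['prev_hash'] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
for entry in ['alice pays bob 5', 'bob pays carol 2']:
    add_block(chain, entry)
assert verify(chain)
chain[0]['data'] = 'alice pays bob 500'   # tamper with history...
assert not verify(chain)                  # ...and verification fails
```

Changing any old block changes its hash, which breaks the link stored in the next block, so the whole chain downstream of the edit fails verification.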
- Application: #5: Internet of Things
The internet of things refers to the growing trend of interconnected devices. From your phone to your fridge, more and more devices are being connected to the internet. This allows them to share data and makes them easier to control. The internet of things is changing the way we live and work, and there’s no doubt that its impact will only continue to grow in the years to come.
- Application: #6: Data Science
Data science is a relatively new field that combines statistics, computer science, and domain expertise to extract knowledge from data. Data science is being used in a variety of industries, from healthcare to retail. And as data becomes increasingly abundant, data scientists will become even more important in helping organizations make sense of it all.
- Application: #7: Machine Learning
Machine learning is a subset of artificial intelligence that allows computers to learn from data without being explicitly programmed. Machine learning is being used in a number of different ways, from fraud detection to object recognition. As machine learning algorithms get more sophisticated, they will continue to revolutionize the way we live and work.
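The phrase “learn from data without being explicitly programmed” can be shown in miniature. In the sketch below (a toy of my own, not any production method), plain gradient descent recovers the hidden rule y = 2x + 1 from examples alone, with no libraries involved:

```python
# "Learning from data" in miniature: instead of hand-coding the rule
# y = 2x + 1, we let gradient descent recover it from examples.
data = [(x, 2 * x + 1) for x in range(-5, 6)]   # the hidden rule, as examples

w, b = 0.0, 0.0        # model: y_hat = w*x + b, starting from ignorance
lr = 0.01              # learning rate
for _ in range(2000):
    # gradients of mean squared error with respect to w and b
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w
    b -= lr * grad_b
# After training, (w, b) is close to the hidden (2, 1).
```

Real systems fit millions of parameters to far messier data, but the loop is conceptually the same: measure the error, nudge the parameters downhill, repeat.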
- Application: #8: Cybersecurity
Cybersecurity is a critical concern for businesses and individuals alike. With so much of our lives taking place online, it’s important to have measures in place to protect our information from hackers and cyber criminals.
These are just some of the many wonders of computer science and software engineering! Each one has the potential to change our world in amazing ways. We can’t wait to see what else these fields have in store for us!
Other notable wonders of computing:
#16 Mobile Phones
Mobile phones are handheld devices that allow us to make calls, send texts, and access the internet while on the go. They have become an indispensable part of our lives and have transformed the way we stay connected with others.
#17 Social Media
Social media platforms like Facebook, Twitter, and Instagram have changed the way we interact with each other. They provide us with a space to share our thoughts, feelings, and experiences with friends and family members who might be located anywhere in the world.
#18 Cloud Computing
Cloud computing is a model of computing that allows users to access data and applications over the internet. It has made it possible for businesses to operate more efficiently by reducing their reliance on physical infrastructure.
#19 Big Data
Big data refers to large data sets that can be analyzed to reveal patterns and trends. It is being used by businesses in a variety of industries to make better decisions about everything from product development to marketing strategies.
#20 Augmented Reality
Augmented reality is a type of technology that overlays digital information on real-world objects. It has many potential applications, including education, gaming, and navigation.
#21 3D Printing
3D printing is a process of creating three-dimensional objects from a digital file. It has revolutionized manufacturing by making it possible to create customized products quickly and easily.
These are just some of the things that computer science and software engineering have made possible! As you can see, these disciplines have had a major impact on our world and will continue to shape the future as we move into the digital age.
Conclusion: So there you have it! These are the top 10 wonders of computer science and software engineering according to me. Do you agree with my list? What would you add or remove? Let me know in the comments below!
In the spirit of choosing artifacts over ideas, I would replace “symbolic computation” with Unix.
I’ve mentioned elsewhere that one can praise in a sentence or two, but criticism ethically demands more careful substantiation.
All I’ll say here is that when Unix was being invented, Xerox Parc was already successfully making systems with dynamic objects that required no separate OS layering or system builds. That said, Doug McIlroy did find the start of what could have been really fruitful ideas when he implemented “pipes” programming. If they had seen what could have been done — reducing the process overhead to zero bytes and going to a dynamic language — then something great could have resulted. By Alan Kay
What do you mean by organizing software systems like the internet?
Just to get you started, consider that on the Internet: (a) processes do not have to be stopped to fix, change, add to, etc.; (b) messages are not commands; (c) units of computation are perfectly encapsulated (only the interior code of a computer can decide to do anything or nothing); (d) units of transmission can be very badly damaged and messages will still get through; (e) scaling is more than 10 orders of magnitude; (f) and on and on and on.
What SW systems do you know of that are remotely like this (that don’t depend intrinsically on what is wonderful about the Internet)?
This doesn’t mean the Internet is a perfect design at all. For example, the add-on of DNS was not nearly as good and lasting a scheme as the base semantics of TCP/IP. (It’s crazy that there are not unique IDs for every kind of entity manifested within the Internet system. Bob Kahn has been advocating this for several decades now — Joe Armstrong among others has pointed out that the modern hashing schemes (SHA256, etc.) are good enough to provide excellent unique IDs for Internet entities, etc.)
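The hashing idea mentioned here is easy to sketch: name each piece of content by the SHA-256 hash of its bytes, giving a self-certifying unique ID independent of where the content lives (a toy content-addressed store of my own; a real naming system would add resolution, persistence, and much more):

```python
import hashlib

# Name content by the SHA-256 hash of its bytes: the same bytes always
# get the same ID, and any change produces a completely different one.
def content_id(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

doc = b'Hello, Internet'
assert content_id(doc) == content_id(b'Hello, Internet')   # deterministic
assert content_id(doc) != content_id(b'Hello, internet')   # any edit -> new ID

store = {content_id(doc): doc}      # a toy content-addressed store
fetched = store[content_id(doc)]    # retrieve by what it is, not where it is
```

An ID like this can be handed to anyone, verified by anyone (just re-hash the bytes), and never needs a central registry to stay unique, which is the appeal over location-bound names like DNS hostnames.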
But the Internet did completely raise many bars qualitatively and astonishingly higher. It should be used as a starting point in thinking about how most SW systems can and should be organized.
Just a note about this big shift in thinking within the ARPA/Parc community — it is hard to pin down just when it happened. But Wes Clark used to say that no computer is any good if it can’t work perfectly with 10% of its wires disconnected! Packet switching (the American version, by Paul Baran at RAND in the early 60s) meant that you could do store-and-forward over a number of routes. If you made the protocol full-duplex, you could guarantee *eventually perfect* delivery of packets. At Parc the huge shift from “theoretical” to “punch in the face reality” came as we realized just how wonderfully well the extremely simple Ethernet was actually performing. This led to the articulated idea that no computation should ever require being stopped in order to improve/change/etc. it.
In other words, make the systems design such that things will work “eventually as wished”, so you can spend your brain cells on (a) better designs, and (b) better optimizations. The Internet grew out of the whole community’s strong and emotional realizations around these ideas. By Alan Kay
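The “eventually as wished” property described above can be sketched with the simplest reliability mechanism: retransmit each packet over a lossy channel until it gets through (a toy stop-and-wait model of my own, with no timeouts or sequence numbers; TCP’s real machinery is far more elaborate):

```python
import random

# Eventually-perfect delivery from a lossy channel: keep retransmitting
# each packet until the receiver acknowledges it.
def lossy_send(packet, p_loss, rng):
    return None if rng.random() < p_loss else packet   # None models a drop

def deliver(packets, p_loss, rng):
    received, attempts = [], 0
    for pkt in packets:
        while True:                     # retransmit until acknowledged
            attempts += 1
            got = lossy_send(pkt, p_loss, rng)
            if got is not None:         # receiver acks; sender moves on
                received.append(got)
                break
    return received, attempts

rng = random.Random(1)
msg = list(range(20))
received, attempts = deliver(msg, 0.3, rng)
assert received == msg   # every packet arrives, in order, despite 30% loss
```

The channel is allowed to be bad; the protocol around it converges on perfect delivery anyway, so designers can spend their effort on better designs and optimizations rather than on heroic per-component reliability.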