Computing

Computing, or Computation Science, is now considered the third pillar of scientific inquiry, alongside theory and experimentation. It is also a natural science, not an artificial one, and highly interdisciplinary. Computing practitioners can find careers in dozens of fields.

Computational Systems in Nature

Computing is a science of information processes, and such processes occur throughout nature, not only in engineered machines.

Information is perhaps fundamental to understanding the universe. The so-called “speed of light” is actually the speed of causality, a speed that information cannot outrun.

Computing is a natural science. Computation is a principle, not a tool.

Exercise: Find answers (via an online search if you wish) to the question of how many bits per second the conscious mind can process. Which researchers came up with that value? How did they measure it?
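While researching, it may help to recall how "bits" are quantified in the first place: Shannon entropy gives the average number of bits needed to encode the outcome of a probability distribution. A minimal sketch (the distributions below are illustrative, not measurements of anything):

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information.
print(entropy_bits([0.5, 0.5]))          # 1.0

# A heavily biased coin carries less: the outcome is more predictable.
print(entropy_bits([0.9, 0.1]))

# Four equally likely outcomes take 2 bits to distinguish.
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))   # 2.0
```

Estimates of the conscious mind's throughput amount to arguing about which distribution of perceptual or motor outcomes to plug into a measure like this, and over what time window.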

Disciplines Within Computing

Computing, a.k.a. Computation Science, or Informatics, has several subfields:

Computer Science (CS)
The theory and practice of computation, algorithms, software systems, data organization, knowledge representation, language, intelligence, and learning.
Software Engineering (SE)
The design, organization, and construction of large-scale, often mission-critical (software) systems, with a focus on product efficiency, reliability, robustness, testing, maintenance, and cost-effectiveness.
Computer Engineering (CE)
The design of digital systems such as communications systems, computers, cell phones, digital audio players, digital video recorders, alarm systems, x-ray machines, and laser surgical tools.
Information Technology (IT)
The construction, maintenance, and troubleshooting of an organization’s computing infrastructure (both hardware and software), including networks, email systems, web sites, databases, and telephony. IT work generally involves configuration and upgrading in addition to programming.
Information Systems (IS)
The design of computing solutions for companies, non-profit organizations, educational institutions, and governments to support their mission and improve their effectiveness. IS is generally taught in business schools.

Questions

The four questions that have motivated advances in the field of computing:

  1. What is computation?
  2. What is information?
  3. What can we know through computing?
  4. What can we not know through computing?

Principles of Computing

Of course it’s important to focus on the technical details of computing systems in order to make things happen, but it’s also useful to step back and take a big-picture look at the field and ask about its big ideas, or underlying principles. In 2007, Peter Denning catalogued what he called the great principles of computing, grouping them into seven categories.

  1. Computation (A science of information processes)
  2. Communication (Reliable data transmission)
  3. Coordination (Cooperation among networked entities)
  4. Recollection (Storage and retrieval of information)
  5. Automation (Delegating tasks to computational systems)
  6. Evaluation (Performance prediction and capacity planning)
  7. Design (Building reliable software systems)

There’s no rule saying these are the be-all and end-all of computing categories, and given that they were drawn up in 2007, perhaps things have changed?

Exercise: These principles seem heavily influenced by engineering, science, and technology. Can you think of any socially influenced themes that might suggest a new category, or a different structure for this ontology?

Myths about Computing

The following myths have done a lot of harm to the image of computing as a field and have kept much-needed talent away:

All B.S.! Here’s some debunking (from quite a few years ago, but still relevant):

10 Myths for Computer Science from Thanos Hatziapostolou
Exercise: Do some research and find some more refutations.

Perhaps the largest myth is that computing is only about programming, and more specifically, programming as if people don’t matter. Programs are written by people for people. Those developing computing systems, both hardware and software, need to be aware of who their products are serving, and take into account the needs and experiences of their users. Here are some reads on modern user experience trends:

History

There is a wealth of information about computing history. See:

Let’s go back in time to 2012. What were computers like then?

Computing is so much more than its technical core, its theoretical concepts, or even its history. Computing is a human activity: computations, programming languages, and machines are designed by humans, for humans.

Grady Booch is building a remarkable transmedia documentary on the intersection of computing and the human experience.

Computing: The Human Experience thumbnail

The web version introduces itself as:

The story of computing is the story of humanity: this is a story of ambition, invention, creativity, vision, avarice, power, and serendipity, powered by a refusal to accept the limits of our bodies and our minds. Computing: The Human Experience is a transmedia project that explores the science of computing, examines the connections among computing, individuals, society, and nations, and by considering the history of computing contemplates the forces that will shape its future.

The story of computing is “powered by a refusal to accept the limits of our bodies and our minds”

It is worth immersing yourself in this content for hours.

Exercise: Spend an hour exploring the site.

Computing in the News

Please visit ACM TechNews. This should show you that computing goes way beyond the popular limited perceptions of the discipline.

Computational Thinking

There’s a term you might have heard: computational thinking (CT). It saw heavy use in the years after the turn of the century; now, not so much. What does it mean?

The term computational thinking was popularized by this three-page article by Jeannette Wing.

However, Wing captured only part of the picture! Lorena Barba has written a nice article claiming that computational thinking does not mean what you think it means.

Exercise: Read both the Wing and Barba articles. Barba refers to Papert’s Power Principle. What is the Power Principle exactly?

Be careful about placing too much weight on the “computational thinking” label, though:

...our goal should not be Computational Thinking, but Science, Mathematics, History, Engineering, and Everything Thinking. Computing is a tool that can be used to learn everything else better. The root of that argument is in the roots of our field.

Mark Guzdial

Careers

If you have a good, solid undergraduate education in Computing (especially Computer Science), you will gain skills enabling you to go on to careers in:

Here’s another one: Live Coding is programming as a performance art. From Wikipedia:

Live coding ... is a performing arts form and a creativity technique centred upon the writing of source code and the use of interactive programming in an improvised way. Live coding is often used to create sound and image based digital media, as well as light systems, improvised dance and poetry, though is particularly prevalent in computer music usually as improvisation, although it could be combined with algorithmic composition.

Here’s a little Sonic Pi performance:

Exercise: Explore Sonic Pi. Optionally, practice building some tracks. At the end of the course, consider giving a live coding performance to the class.
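To give a flavor of what live coding looks like, here is a minimal sketch in Sonic Pi’s Ruby-based language (it runs inside the Sonic Pi environment, not as plain Ruby; the synth and sample names follow Sonic Pi’s built-in conventions):

```ruby
# A simple two-part groove: edit either loop while it plays
# and Sonic Pi picks up the change on the next cycle.
live_loop :drums do
  sample :bd_haus          # kick drum
  sleep 0.5                # half a beat between hits
end

live_loop :arp do
  use_synth :tb303         # classic acid-bass synth
  play_pattern_timed [:e3, :g3, :b3, :e4], [0.25]  # arpeggiate an E minor figure
end
```

The performance aspect comes from editing these loops on the fly: because each `live_loop` restarts its body continuously, changing a note list or a sample name mutates the music without stopping it.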

How about a career in electronic voting? Um....

Professional Practice

Computing has elements of science, art, craft, and engineering. Computing practice is generally very collaborative and also interdisciplinary. To do well, you should possess various soft skills in:

and you should have the following personal attributes:

Gender issues in the computing field are often in focus; here are some interesting resources for learning more:

Exercise: Read the Turkle and Papert paper.

It’s not just gender that gets discussed in the world of tech; there are larger issues surrounding race, allyship, intersectionality, you name it. The idea of belonging in tech is a subject of study. More resources:

Thanks to friends at Google and UpperlineCode for curating a good portion of this list.

Occupational Outlook

Jobs for graduates with degrees in Computer Science, Software Engineering, AI Engineering, and related fields remain plentiful, but not all jobs within the industry are equally in demand, and the demand varies greatly by experience. Check out the U.S. Bureau of Labor Statistics Outlook for the following professions in the Computer and Information Technology sector:

The increasing automation of software production has changed the landscape a great deal.

The BLS also has some projections on how AI might impact employment in the coming years.

Summary

We’ve covered:

  • How computing is a natural, not an artificial, science
  • Several subdisciplines of computing
  • Principles of computing
  • Myths about computing
  • The idea of computational thinking (and how it has changed)
  • The variety of careers that a computing education can prepare you for
  • Aspects of the professional practice of computing
  • Resources on belonging in tech
  • Occupational outlooks for different professions in computing