Showing posts with label education. Show all posts
Monday, December 31, 2012
The book
Have you ever wondered what gives computers such remarkable power and flexibility? One answer is that computer designers and software developers build them that way. But that’s not entirely satisfying as an answer. Computing for Ordinary Mortals starts in a different way:
Thursday, November 8, 2012
Ordinary Mortals and CS education
This post can also be read as a public Google doc.
I’ve written Computing for Ordinary Mortals for readers without a technical background or even much experience in computing. My thought was this: If you wanted to explain what computing is all about, starting from scratch, what would you say? You have a tremendous number of decisions to make, about which topics are critical and which can be left out, about the ordering and detail of the topics you include, and about how the topics fit together into a comprehensive whole. For what it’s worth, some computer scientists will make different decisions than mine. Most popular science books and textbooks go into greater detail about representing and managing data; I delay a discussion of programming concepts until after algorithms and abstract data types; I punt on the question of whether computer science is a branch of applied mathematics (see Dijkstra’s “How do we tell truths that might hurt?” [PDF], though he was talking about programming), or a branch of science (Newell, Perlis, and Simon’s “What is computer science?”), or a branch of engineering (Eden’s “Three Paradigms of Computer Science” [PDF]), or perhaps something different (Rosenbloom’s On Computing, or Graham’s “Hackers and painters”).
Writing a popular science book on computing means taking a stand on such issues, but the constraints of the genre didn’t make it easy for me to say, “Here’s what I’m doing...” That’s in part what this document is for, to identify the connections between Ordinary Mortals and the field of computer science, at least as it’s currently taught at the university and secondary school levels.
Tuesday, October 30, 2012
The Big Idea (Whatever)
When I was ten years old or so, I saw a battered paperback copy of Triplanetary on my grandfather’s bookshelf. I borrowed it… and found myself in ten-year-old heaven. Science fiction led me to popular science, with Isaac Asimov (and Edgar Cayce, embarrassingly enough) to help me cross the boundary. I read about physics, space, biology, math, and psychology. It was formative reading. Today I’m a computer scientist, and I’ve just written my own book.
The big idea in Computing for Ordinary Mortals is that the basics of computer science can be conveyed through stories. Not stories about computers and how we use them, but stories about other kinds of everyday things we do. Computing is more about abstract concepts than about hardware or software, and we can understand these concepts through analogies to what happens in the real world.
Read the rest in a Big Ideas post on John Scalzi's Whatever blog.
Wednesday, September 12, 2012
Computer programming is the new literacy (OUP)
It’s widely held that computer programming is the new literacy. (Disagreement can be found, even among computing professionals, but it’s not nearly as common.) It’s an effective analogy. We all agree that everyone should be literate, and we might see a natural association between writing letters for people to read and writing programs for computers to carry out. We also find a historical parallel to the pre-Gutenberg days, when written communication was the purview mainly of the aristocracy and professional scribes. Computation is an enormously valuable resource, and we’re only beginning to explore the implications of its being inexpensively and almost universally available.
But is programming-as-literacy an appropriate analogy? We tend to think that basic literacy is achieved by someone who can say, “Yes, I can read and write.” Let’s see what this means in the context of programming.
Read the rest on the OUPblog, "Oxford University Press's Academic Insights for the Thinking World."
Wednesday, September 5, 2012
Code Year: Why You Should Learn to Code (HuffPo)
You probably know about Code Year. Code Year, sponsored by Codecademy, challenges people to learn how to program in 2012. The Codecademy website offers free online lessons in a variety of programming languages; it's received attention in the press and saw a large boost from a comment by New York Mayor Michael Bloomberg on Twitter: "My New Year's resolution is to learn to code with Codecademy in 2012! Join me."
Hundreds of thousands of people have joined Bloomberg. Even though my own Code Year was 30 years ago, I can still appreciate the appeal -- you'll learn how to write software to make your computer do new and wonderful things that you find valuable, instead of depending only on what others have done. That's empowering.
Friday, May 25, 2012
An ongoing revolution... in computing education
These days a lot of people seem to be thinking, "Maybe I could try one of those free online courses and learn how to program." Others say, "What's the point?" (Juliet Waters, in blogging about her New Year’s resolution to learn how to code, explains what the point is.) Some even say, "No! Please don't learn to code!" Fortunately, the last category holds only a tiny minority of people.
The past six months have seen a surge of public interest in computing. The UK is refocusing its pre-university curriculum on information and communications technology to concentrate on the science of computing. (This is good timing; 2012 marks the centenary of the birth of Alan Turing, the London-born founder of computer science.) In the New York Times, Randall Stross writes about computational thinking as a fundamental skill for everyone. When even the mayor of New York City decides to join Codecademy to learn how to program, people take notice. A minor revolution is underway in formal and informal computing education.
Thursday, May 17, 2012
Experiencing Design: One’s own experience
For a few years I wrote a column on human-computer interaction (HCI) for the British quarterly trade magazine Interface. This post is one of those columns, slightly revised.
Every year I greet a new group of computer science students who have signed up for my HCI course. By the end of the semester, most of them will have a reasonable grasp of the basics of HCI, and some will even be enthusiastic about the topic. Projects turned in by students, working in teams, have included a voice-controlled video game, a gesture-controlled Web browser, a social networking application for gamers, and a variety of personal information organizers, on the desktop as well as on cell phones and other mobile devices.

Over the past ten years or so I've noticed students becoming more interested in applications that push the bounds of what's currently possible. The projects generally target what Jonathan Grudin calls discretionary hands-on use ("Three Faces of Human-Computer Interaction," IEEE Annals of the History of Computing, 2005). That is, students are less interested in building a better calendar system, financial planner, or electronic voting ballot; they look to applications and devices that fit into the natural and often optional activities of our everyday lives. How can I contact my friends? Could I play a familiar game in a different way? What would people like to do with their mobile phones that isn't easy to do now?