Find out how artificial intelligence could impact the future workforce with this video and educational resources from PBS NewsHour. This video comes with a handout that includes discussion questions. It can be played during a lesson on researching the impact of computing technology on possible career pathways.
In this video excerpted from Pathways to Technology, you'll learn about the wide range of jobs that an information technology (IT) degree can make possible. From PCs to smartphones to cars, almost every tool we use today has computer technology embedded in it. The IT specialist is the person who keeps those computers operating and finds ways to make them run faster and more smoothly, so we can all get our work done. This video has discussion questions.
We’re going to take a step back from programming and discuss the person who formulated many of the theoretical concepts that underlie modern computation - the father of computer science himself: Alan Turing. Normally, we try to avoid “Great Man” history in Crash Course because, truthfully, all milestones in humanity are far too complex to attribute to a single individual or view through a single lens.
We’re going to step back from hardware and software, and take a closer look at how the backdrop of the Cold War and Space Race, along with the rise of consumerism and globalization, brought us from huge, expensive codebreaking machines in the 1940s to affordable handhelds and personal computers in the 1970s. This is an era that saw huge government-funded projects - like the race to the moon. And afterward, a shift toward the individual consumer, the commoditization of components, and the rise of the Japanese electronics industry.
We're going to talk about the birth of personal computing. Up until the early 1970s, components were just too expensive, or too underpowered, to make a useful computer for an individual, but this began to change with the introduction of the Altair 8800 in 1975. In the years that followed, we'll see the founding of Microsoft and Apple and the creation of the 1977 Trinity: the Apple II, Tandy TRS-80, and Commodore PET 2001. These new consumer-oriented computers would become a huge hit, but arguably the biggest success of the era came with the release of the IBM PC in 1981. IBM completely changed the industry, as its "IBM compatible" open architecture consolidated most of the industry except for, notably, Apple. Apple chose a closed architecture, forming the basis of the Mac versus PC debate that rages today. But by 1984, Apple was losing market share fast, so it looked for a way to offer a new user experience like none other - which we'll discuss next time.
Learn how game developers are shaping the future of video games in this video from SciTech Now. Fifty-nine percent of Americans play video games, which leads many people to believe that games are becoming just as important culturally as television and movies. The Game Innovation Lab at NYU Polytechnic School of Engineering is dedicated to exploring new ideas in video games. Students and faculty there blend computer science, art, math, education, and design to create innovative games. This video comes with discussion questions.
Learn about the exciting opportunities for and vital roles played by Cyber Security Analysts in the growing field of Cyber Security. Watch this Science Matters video to learn what it means to be a Cyber Security Analyst and see if this job could be the right fit for you.
Worms. Trojan Horses. Hackers and Clouds. It is not what’s happening in the movies, but what’s happening in cyberspace - that area that connects all of us and our computers to the internet and each other. Cyber Security is important to all of us and it's a HOT JOB of the future. This video comes with discussion questions.
Robots are often thought of as a technology of the future, but they're already here by the millions - in the workplace, in our homes, and pretty soon on the roads. We'll trace robotics from its origins to its proliferation, and even look at some common control designs that were implemented to make robots more useful in the workplace.
In the past 70 years, electronic computing has fundamentally changed how we live our lives, and we believe it’s just getting started. From ubiquitous computing, artificial intelligence, and self-driving cars to brain-computer interfaces, wearable computers, and maybe even the singularity, there is so much amazing potential on the horizon.
Welcome to Crash Course Computer Science! This video will take a look at computing’s origins because even though our digital computers are relatively new, the need for computation is not.
We ended the last episode at the start of the 20th century with special-purpose computing devices such as Herman Hollerith’s tabulating machines. But the scale of human civilization continued to grow, as did the demand for more sophisticated and powerful devices. Soon, these cabinet-sized electro-mechanical computers would grow into room-sized behemoths that were prone to errors. But it was these computers that would help usher in a new era of computation - electronic computing.