No moniker has come to describe our age better than the “digital age.” Indeed, nearly everything we do — shopping, interpersonal communication, entertainment — happens via the world’s massive network of digital technology and media.
Perhaps no development in this era of technological advancement has been as groundbreaking as social media. With their power to connect friends and family around the world, companies like Facebook, YouTube and Google have gone from emerging Silicon Valley startups to multibillion-dollar giants with millions of daily users. But the tremendous growth of a few big tech companies has come with consequences, according to some.
As that growth continues, computer science educators are trying to help students understand the rise of big tech in the classroom.
“It’s a thing that’s constantly on our minds: How can we talk about it to our students?” said Lane Computer Science teacher Mr. Berg of developments to the Lane CS curriculum.
Mr. Law, the Computer Science Department chair, also stressed the importance of learning about issues of data privacy. Concepts such as data, privacy and machine learning have, according to Law, been incorporated into nearly every Computer Science course at Lane, from Exploring Computer Science to the higher-level classes.
“It is not possible to teach computer science nowadays without talking about data and the ethics of data,” Law said.
But these important developments and concepts affect everyone, not just computer science students.
One major consequence is the consolidation of user data. According to the FTC, this can come in the form of website cookies, browsing history or the information that users provide on social media.
“The ability that companies have to track you is pretty amazing,” Lane Computer Science teacher Mr. Stone said. “Your search history, your daily commute, the places you shop at, what you buy, who you talk with, your political views, all of that and more is accessible rather easily to these companies.”
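One common mechanism behind the tracking Stone describes is the third-party cookie: a tracking company embedded on many different sites sets one identifier in the browser and recognizes it everywhere. The sketch below is a simplified toy model, not any real company's system; the class and page names are invented for illustration.

```python
import uuid
from collections import defaultdict

# Toy model of cross-site tracking: a third-party "tracker" assigns each
# browser a cookie ID and logs every page that embeds its script.
class Tracker:
    def __init__(self):
        self.visits = defaultdict(list)  # cookie ID -> pages seen

    def get_cookie(self, browser):
        # Reuse the cookie if the browser already has one, else set a new one
        if "tracker_id" not in browser:
            browser["tracker_id"] = str(uuid.uuid4())
        return browser["tracker_id"]

    def log_visit(self, browser, page):
        self.visits[self.get_cookie(browser)].append(page)

tracker = Tracker()
browser = {}  # stands in for one browser's cookie jar

# The same cookie follows the user across unrelated sites
for page in ["news-site.example/politics", "shoe-store.example/boots",
             "maps.example/commute"]:
    tracker.log_visit(browser, page)

profile = tracker.visits[browser["tracker_id"]]
print(profile)  # one linked history spanning every site the tracker is on
```

Because every site embeds the same tracker, the three visits end up under a single ID, which is how search history, shopping habits and commutes can be stitched into one profile.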
In one example from 2012, a father stormed into his local Target and complained to the manager that his daughter was receiving coupons for cribs and baby clothes in the mail. Target knew something that the father did not: the teen was pregnant. Law said that this example is used often in Lane CS classes to show how machine-learning algorithms access and analyze our data and often end up knowing more about us than our close friends and relatives.
Law said he sees that as part of a trend of merging between marketing, data science and computer science.
“The field of marketing has been learning how to define data and figure out what a person wants or needs based on that data,” Law said. “They’ve been studying this for many decades now. They’ve learned how to do that with data that wasn’t even collected in some fancy way.”
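The Target anecdote boils down to a simple idea: individual purchases are weak signals, but several of them together shift a predicted probability enough to act on. The sketch below illustrates that idea in miniature; the products, weights and threshold are all invented for illustration and do not reflect any retailer's actual model.

```python
# Hypothetical signal products and weights -- invented for illustration,
# not Target's real model. Each purchase alone means little; together
# they push a score past a marketing threshold.
PREGNANCY_SIGNALS = {
    "unscented lotion": 0.2,
    "mineral supplements": 0.15,
    "cotton balls (large bag)": 0.1,
    "scent-free soap": 0.15,
}

def pregnancy_score(purchase_history):
    """Sum the weights of signal products found in a shopper's history."""
    return sum(weight for item, weight in PREGNANCY_SIGNALS.items()
               if item in purchase_history)

shopper = ["unscented lotion", "mineral supplements", "scent-free soap"]
score = pregnancy_score(shopper)
print(round(score, 2))  # 0.5 -- in this toy model, enough to mail coupons
```

Real systems fit such weights statistically over millions of shoppers rather than hand-picking them, but the logic of combining mundane signals into a confident prediction is the same.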
Companies can scrutinize our personal data for the purpose of targeting advertisements at us. Technology scholar Shoshana Zuboff has named the process of corporate data collection “surveillance capitalism.” Zuboff defines surveillance capitalism as “the unilateral claiming of private human experience as free raw material for translation into behavioral data.” That data is analyzed and used for commercial purposes.
The collection and use of this data often raises ethical and legal questions as to how companies are able to use the data of their users. In 2018, Facebook came under fire after it was revealed that the company allowed political consulting firm Cambridge Analytica to mine the data of up to 87 million Facebook users without their consent. Facebook was eventually fined $5 billion for the scandal, but questions of data privacy remain.
Tech companies use complex AI-driven recommendation algorithms that can entice users to click on new content based on what they have already seen.
For example, Facebook’s News Feed algorithm ranks stories based on factors such as a user’s history of clicking links to a particular website. If a user regularly clicks on links from partisan or unreliable news sites, these sites have a greater chance of reappearing in their feed. According to research published by data scientists at Facebook, the News Feed algorithm reduces exposure to politically diverse content by 5% for conservatives and 8% for liberals.
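The feedback loop described above can be sketched in a few lines: if past clicks on a source boost that source's rank, the sources a user already favors keep resurfacing. This is a toy model with invented source names and a single made-up signal; real feed ranking combines many more factors.

```python
from collections import Counter

# Toy engagement-based feed ranking: stories from sources the user has
# clicked before are boosted, so past clicks shape future exposure.
click_history = ["partisan-blog.example"] * 5 + ["local-paper.example"]
clicks_per_source = Counter(click_history)

stories = [
    {"title": "Outrage headline",      "source": "partisan-blog.example"},
    {"title": "City budget report",    "source": "local-paper.example"},
    {"title": "Fact-checked analysis", "source": "wire-service.example"},
]

def rank_feed(stories, clicks):
    # More past clicks on a source -> higher position in the feed
    return sorted(stories, key=lambda s: clicks[s["source"]], reverse=True)

feed = rank_feed(stories, clicks_per_source)
print([s["source"] for s in feed])
# the frequently clicked partisan source rises to the top
```

The never-clicked wire service sinks to the bottom even if its story is the most reliable, which is the narrowing effect the Facebook researchers measured.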
According to YouTube Chief Product Officer Neal Mohan, 70% of what users watch on the site is fed to them by the recommendation algorithm. However, there is some evidence that the YouTube algorithm has served as a gateway to videos that peddle conspiracy theories and other extreme views.
“As you start clicking on something you get content that’s a little bit more extreme and then a little bit more extreme and then you go down the rabbit hole,” said Mr. Berg, who also teaches Introduction to Artificial Intelligence.
So how should we as a society be thinking about big tech? Law said that he believes that curiosity and critical thinking on the part of technology users is important. He also doesn’t buy into doom-and-gloom thinking.
“As long as we have generations of students and generations of young 20-somethings and teenagers who continue to say, ‘What’s going on here?’ and as long as people continue to be curious, I am not too worried,” Law said. “I would be more concerned if people were showing apathy about it.”