How Much Bias Is Okay in Your School?
The algorithm has proven to be a handy tool for solving education problems. It's also not without bias.
You may be wondering how something as benign as a simple math formula can be harmful. In essence, the algorithm presents little, if any, danger. It's what happens next that can alter your child's future.
Algorithms are part of machine learning. Currently, computers learn what we teach them, although there is growing agreement that one day, computers will be able to teach themselves.
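To make the "simple math formula" idea concrete, here is a minimal, hypothetical sketch in Python of the kind of scoring formula an edtech tool might run. The inputs and weights are invented for illustration; no real product is described here:

```python
# Hypothetical "at-risk" scoring formula of the kind an edtech tool
# might use. The inputs and weights below are invented for illustration.

def at_risk_score(attendance_rate, avg_quiz_score, assignments_late):
    """Combine a few classroom signals into a single 0-100 'risk' score."""
    score = (
        40 * (1 - attendance_rate)       # missing class raises the score
        + 40 * (1 - avg_quiz_score)      # low quiz averages raise it
        + 4 * min(assignments_late, 5)   # capped penalty for late work
    )
    return round(min(score, 100), 1)

# The formula itself is just arithmetic. The concern is what happens
# next: whether the number is used to track, label, or limit a child.
print(at_risk_score(attendance_rate=0.9, avg_quiz_score=0.75, assignments_late=2))
```

Notice that the math is trivial, but every weight reflects a human choice about which students get flagged, and those choices can encode bias.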
Here’s why you want to know everything you can about the algorithms being used in your child’s school.
The two questions you must ask
Edtech companies have nearly unlimited access to your child’s personal data. If you’re a parent, you’ll want to know two things about the algorithms being used in classroom software.
1. What is being collected, and why?
Collecting large amounts of data allows schools and edtech companies to spot trends, make predictions and adjust for future needs. However, not all data being collected on your child may be necessary or free of bias. The data certainly shouldn't be identifiable, meaning that no one should be able to tell that the data represent your child. Data to be concerned about include:
· Social Security/State ID numbers
· Physical addresses
· Medical history
· Family financial information
· Social connections
· Facial recognition photos
· Biometrics
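One common safeguard worth asking about is pseudonymization: replacing a direct identifier with a salted one-way hash before data is stored or shared. Here is a minimal sketch using Python's standard library; the record fields and salt are invented for illustration:

```python
import hashlib

def pseudonymize(record, secret_salt):
    """Replace a student's direct identifier with a salted one-way hash."""
    raw = (secret_salt + record["student_id"]).encode("utf-8")
    out = dict(record)
    out["student_id"] = hashlib.sha256(raw).hexdigest()[:16]
    return out

record = {"student_id": "123-45-6789", "grade": "B+", "absences": 3}
safe = pseudonymize(record, secret_salt="district-secret")
# The academic fields survive for trend analysis, but the ID no longer
# reveals who the student is and can't be reversed without the salt.
```

Note that pseudonymization alone is not full de-identification: combinations of remaining fields (address, birth date, school) can still re-identify a student, which is why the full list above matters.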
Ultimately, you want to know if the data will be used to predict academic standing or the likelihood of behavioral challenges. Determine how accurate the predictions are and whether machine-learning algorithms are the only way your child will be evaluated.
2. Where is the data stored?
Edtech software algorithms collect and sort data. Then they store it. Billions of bytes of data sit housed in servers or in the cloud. None of them are immune from data security breaches. Ask about:
· Data protection levels
· Policies and procedures for using USBs and mobile devices
· What happens to the data at the end of the school year or at graduation
Only persons with a legitimate need to know should have access to your child’s personal information. That’s not just a good idea – it’s federal law, under FERPA.
If unscrupulous persons got hold of your child's data, could they act maliciously based on bias? For example, could a future employer use your child's accumulated school data as a reason not to hire them?
The other side of the story
Districts do what they can to protect student data and guard against bias, but once an edtech company collects the data, your child's privacy is at risk. Many companies are conscientious about data protection, but what happens to the data if the company is sold?
Edtech companies also have a valid concern. If they must share their algorithms and explain how their collection system works, they are making public their proprietary software. Ultimately, the public release of the algorithms could present a safety concern.
Revealing the steps in an algorithm would allow anyone with bad intentions to deconstruct the formulas and attack entire systems. In cities, that means infrastructures like utilities and public transportation systems. In schools, your child’s historical information (academic, medical, biographic) could be held for ransom or used against them later in life.
Large cities like New York have pushed for legislation that would increase transparency to the point that personal privacy – even that of minors – could be eliminated.
Perhaps we have allowed the algorithm to become too useful in helping us educate our children. We need to be more protective of our children to keep them safe and prevent possible bias.