Dear readers, please be extra careful online on Friday. News that President Trump has tested positive for the coronavirus has created the kind of fast-moving information environment in which we are susceptible to reading and sharing bogus or emotionally manipulative material online. It is already happening.
I found these guides from The Verge and The Washington Post helpful for how not to contribute to online confusion, needless arguments and false information. One smart rule of thumb: If you have a strong emotional reaction to something, step away from the screen.
Technology is neither fairer nor more effective than people. Sometimes we shouldn't use it at all.
That is the message of Meredith Broussard, a computer scientist, artificial intelligence researcher and professor of data journalism at New York University.
We discussed the recent explosion of schools relying on technology to remotely monitor students taking exams. Broussard told me this was an example of technology misapplied.
My colleagues reported this week on software designed to catch students cheating on tests by doing things like tracking their eye movements through a webcam. Students told my colleagues and other journalists that it felt invasive and unfair to be suspected of cheating because they read the test questions aloud, had snacks on their desks or did other things the software deemed suspicious.
Test monitoring will never be perfect, and the pandemic has forced many schools to adopt imperfect options for virtual learning. But Broussard said the underlying problem is that people too often misapply technology as a solution when they should be tackling the problem differently.
Instead of deploying invasive and imperfect software to keep the test-taking process as normal as possible in extremely abnormal times, what if schools ditched closed-book exams during a pandemic?
“Distance education has to be a little different and we can all adapt,” Broussard told me.
Broussard, who has written for The New York Times Opinion section about the misuse of software to assign grades to students, also said schools should review exam-monitoring software and other technology, assess whether it is actually helping students, and abandon it if it is not.
Broussard's way of seeing the world goes beyond education. She wants us all to rethink how we use technology, period.
There are two ways of thinking about software or digital data used to help make decisions in school and beyond. One view is that imperfect results call for better technology or more data to make better decisions. That has been the approach with facial recognition software, which tries to identify people from images or video streams and has proved flawed, particularly for people with darker skin.
Broussard takes the second view. There is no effective way to design software to make social decisions, she said. Education is not a computer problem, and neither is law enforcement. Social inputs like racial and class biases are part of those systems, and software will only magnify the biases.
Fixing the computer code isn't the solution in those cases, Broussard said.
Talking to Broussard caused a shift in my thinking, but it took me awhile. I kept asking her, “But what about . . . ?” until I absorbed her message.
She isn't saying we shouldn't use software to spot suspicious credit card transactions or to detect possible cancerous lesions in medical scans. But Broussard believes we need to be selective and careful about when and how we use technology.
We need to be more aware of when we try to apply technology to areas that are inherently social and human. Technology fails at that.
“The fantasy is that we can use computers to build a system that frees us from all the messiness of human interaction and human decision-making. It's a deeply antisocial fantasy,” Broussard said. “There is no way to build a machine that gets us out of the messiness of humanity.”
This article is part of the On Tech newsletter. You can sign up here to receive it on weekdays.
Everyone tells Facebook to do one thing. It's doing the opposite.
People worried about the spread of false conspiracy theories and misinformation online have pointed to the risks of Facebook groups, gatherings of people with common interests. Groups, especially invitation-only ones, have been places where people can promote bogus health remedies and unhinged ideas, and plan violent plots.
Facebook recommends groups, including those that discuss extremist ideas, to people browsing their feeds. My colleague Sheera Frenkel told me that nearly every expert she knows says Facebook should stop these automated recommendations for groups devoted to misinformation like the QAnon conspiracy. That's tricky, in part because groups focused on harmful ideas often hide their focus.
Facebook knows about the problems with group recommendations, and it is responding by . . . making MORE recommendations for groups that are open for anyone to join. That was among the changes Facebook announced on Thursday, along with new tools that let group administrators block certain people or topics in posts.
That's Facebook's answer: Make group administrators responsible for bad things happening, not Facebook. (To be fair, Facebook is doing more to steer people toward public groups rather than private ones, where outsiders are less likely to see and report harmful activity.) But Facebook is not fully adopting a safety measure that everyone is shouting about from the rooftops.
Why? Because it's hard for people and companies to change.
Like most internet companies, Facebook is built to grow. It wants more people in more countries to use Facebook more and more. Recommending that people join groups is one way to get them to find more reasons to spend time on Facebook.
My colleague Mike Isaac has told me that growth can override all other imperatives at Facebook. The company says it has a responsibility to protect people and not contribute to harmful information. But when people's safety conflicts with Facebook's growth mandate, growth tends to win.
When our taxes go toward fighting the wrong problem: My colleague Patricia Cohen reported that some efforts to root out fraud in America's state unemployment insurance systems have mistakenly targeted people who misstate their eligibility, rather than the networks of criminals who steal other people's identities to cheat the government out of money.
The pros and cons of paycheck advance apps: Apps like Earnin, which give people an advance on their paychecks, have been a lifeline for many during the pandemic. But my colleague Tara Siegel Bernard writes that the apps raise some of the same concerns as traditional payday lenders: high fees or deceptive business practices that can trap people in costly cycles of debt.
Seriously, things are nuts. Please watch something good: Personally, I'm going to wallow in YouTube videos of the cooking rock star Sohla El-Waylly. See this and other recommendations in The New York Times's Watching newsletter.
Crumpet the cockatoo loves vegetables and sings wonderfully.
We want to hear from you. Tell us what you think of this newsletter and what else you'd like us to explore. You can reach us at [email protected].
If you don't already get this newsletter in your inbox, please sign up here.