In case you missed it, the U.S. Government responded to Elon Musk buying Twitter by establishing an office to control “disinformation” aka “The Ministry of Truth.”
Very disturbing.
Visitors to our web site, and indeed to other web sites, may have noticed the reCAPTCHA logo on its pages.
Like just about every web site on the planet, we have at times been inundated by spam, in which automated scripts known as bots fill in forms to send marketing messages to web site owners, for whatever nefarious purpose their operators think that might achieve.
For some time, web site operators have used various methods to try to filter out the bots and ensure that only humans fill in the forms. Some of these methods include distorted letters, questions easily answered by humans but less so by computers (e.g. “what is the third word in this list?”), or my favourite: “tick this box if you are not a robot.”
The latest version, v. 3, requires no human interaction at all. It sits in the background, analysing clicks and other interactions with the page to work out whether the user is likely to be a person. Bots are rejected and people are allowed through; in the grey area, a challenge screen is opened to try to confirm that the user is human.
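To make the scoring idea concrete, here is a minimal Python sketch of how a site's server might act on a v3 verdict. The siteverify endpoint and its `secret`/`response` parameters are part of Google's documented API, but the threshold values and the function names here are illustrative assumptions, not anything Google prescribes.

```python
import json
import urllib.parse
import urllib.request

# Google's documented verification endpoint for reCAPTCHA.
VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"


def fetch_score(secret_key: str, token: str) -> float:
    """POST the token sent by the browser to the siteverify API and
    return the 0.0-1.0 score. Requires network access and a real site
    secret, so it is not exercised here."""
    data = urllib.parse.urlencode(
        {"secret": secret_key, "response": token}
    ).encode()
    with urllib.request.urlopen(VERIFY_URL, data=data) as resp:
        result = json.load(resp)
    # v3 responses carry "success" and a "score"; treat failures as bot-like.
    return result.get("score", 0.0) if result.get("success") else 0.0


def classify(score: float,
             allow_above: float = 0.7,
             challenge_above: float = 0.3) -> str:
    """Map a score to an action. The two thresholds are assumptions
    each site tunes for itself."""
    if score >= allow_above:
        return "allow"        # likely human: let the submission through
    if score >= challenge_above:
        return "challenge"    # grey area: show an interactive challenge
    return "reject"           # likely bot
```

So a confident `classify(0.9)` lets the form through, a middling `classify(0.5)` triggers the challenge screen, and a low score is rejected outright.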
No doubt, a v. 4 will be needed soon as the arms race between web sites and invasive bots continues.
Wikipedia was to me something like a direct knowledge pipeline. Without the intermediary of lectures, grades, or commutes (not to mention social interaction), a simple keyword search could offer an evening of discovery.
When Wikipedia wasn’t particularly well-known, the (relatively) early adopters who made use of it suddenly became ultra-erudite flâneurs of the Interwebs (at least in their own minds), scholars of the finest grade among plebeian peers. I first experienced this in the late 2000s, though I’m pretty sure there are many of you who experienced these “powers” earlier.
Whereas “normal students” might have to visit the library for some information, Wikipedia gave resourceful students super powers in the ‘Language Arts,’ ‘Social Studies,’ or wherever else synthesizing coherent (enough) positions was demanded [1].
Even when the graders caught on to the fact that many students weren’t practicing research methods, but were just lazily repeating views they found on Wikipedia, cleverness found a way out:
It’s easy! Just because we aren’t allowed to cite Wikipedia doesn’t mean we can’t use Wikipedia’s citations!
Scroll… scroll… scroll…
Ahhh. I see! There we go, finding five ‘academic sources’ wasn’t so hard at all. Didn’t even have to go to the library. ^_^
Finding sources for school papers

Fast-forward to the Obama years. Going through school, Millennials, both Wiki-reading and not, had by and large failed to acquire the skills to evaluate the reasoning in arguments and the sources of their evidence.
The growth of the Internet put the temptation of limitless knowledge in front of us. And without a cost? I, for one, donated to Wikipedia a couple of times in gratitude for the convenience and the eye-opening information it had allowed me to access.
Alas, I now realize the cost of relying heavily on Wikipedia was not just a matter of shekels. Many of us had already “paid” Wikipedia with our ability to critically filter. Trusting the smarties to sort out what was good and true, we relied on Wikipedia…
Read the rest of the article here
Adam Ford nails it: