Tech terms push out nature terms in kids’ dictionary

[Photo: blackberries]
Some Canadians are upset that the new edition of the Oxford Junior Dictionary, a sort of practice dictionary for seven-year-olds, has dropped several words from nature in favor of tech terms. From the Canadian Press:

VANCOUVER — A B.C. environmental group is
flabbergasted that the publisher of the Oxford Junior Dictionary has
sent words like “beaver” and “dandelion” the way of the dodo bird.

In the latest version of its dictionary for schoolchildren, Oxford
University Press has cut nature terms such as heron, magpie, otter,
acorn, clover, ivy, sycamore, willow and blackberry.

In their place, the university publishing house has substituted
more modern terms, like the electronic Blackberry, blog, MP3 player,
voicemail and broadband.

Canadian wildlife artist and conservationist Robert Bateman, whose
Get to Know Program has been inspiring children to go outdoors and “get
to know” their wild neighbours for more than a decade, said the
decision is telling kids that nature just isn't that important.

“This is another nail in the coffin of human beings being
acquainted with nature,” Mr. Bateman said in an interview with The
Canadian Press.

“If you can't name things, how can you love them? And if you don't
love them, then you're not going to care a hoot about protecting them
or voting for issues that would protect them.”

[…]

“I don't want to sound like an old you-know-what, but I have a
feeling that quite a number of decisions are made by 20-somethings or
30-somethings,” he said. “There are a whole bunch of them out there who
were raised on Saturday morning cartoons and video games and not out in
nature.”

Mr. Bateman plans to fire off a letter to the university press brass in protest.

“I find it frightening what is happening, that people are losing a connection with nature,” he said.

Link: Nature lovers livid as 'blog' replaces 'beaver' in Oxford's junior dictionary.

I think the uproar is a bit silly, but still… broadband? Blackberry?

The photo of blackberries (the old-fashioned kind) is from wildmanstevebrill.com.

Lewis Lapham on the humanities in a technological age

This is from Lewis Lapham's preamble in the current issue of Lapham's Quarterly. The theme of the issue is Ways of Learning.

From time to time in the scholarly journals and the
alumni magazines I come across articles that might as well be entitled
“What in God’s Name Are the Humanities, and Why Are They of Any Use to
Us Here in the Bright Blue Technological Wonder of the 21st Century?”
The question suggests that within the circles of informed academic
opinion the authorities construe the humanities as exquisite ornaments,
meant to be preserved, together with the banknotes and the jewels, in
the vaults of the university’s endowment—an acquaintance with the
liberal arts one of those proper appearances that must be kept up,
together with the house in Southampton and the season’s subscription to
the Metropolitan Opera. Apparently content to believe that man’s
machines have vanquished nature, subjugated the tribes of Paleolithic
instinct, and put an end to history, the oracles in residence walk to
and fro among the old trees sold to the alumni as naming opportunities,
speaking of tenure and tables of organization, of Rembrandt’s drawings
and Shakespeare’s plays as pheasants under glass. Their piety recalls
the lines of Archibald MacLeish:

Freedom that was a thing to use
They made a thing to save
And staked it in and fenced it round
Like a dead man’s grave.

To
bury the humanities in the tombs of precious marble is to fail the quiz
on what constitutes a decent American education. Like the sorcerer’s
apprentice, our technologists produce continuously improved means
toward increasingly ill-defined ends; we have acquired a great many new
weapons and information systems, but we don’t know at what or at whom
to point the digital enhancements. Unless the executive sciences look
for advice and consent to the senate of the humanities, we stand a
better than even chance of murdering ourselves with our own toys. Not
to do so is to make a mistake that is both stupid and ahistorical.

Link: Playing with fire.

Nick Carr’s sources for “Is Google making us stupid?”

Nick Carr has posted a comprehensive list of sources and related readings for his Atlantic piece "Is Google Making Us Stupid?" This is excellent and worth digging into: "Is Google Making Us Stupid?": sources and notes.

As I wrote before, I think the key question is whether or not there is scientific evidence for these effects — and Carr references one study and a book (Proust and the Squid) that claim such evidence. That's the most powerful part of Carr's article, in my opinion, and I haven't seen a rebuttal that doesn't ignore it (and thus fail as a rebuttal) — to wit, the bloviating, er, debate about the article that continues at The Edge and other such forums.

The Tender Ears of the Blogosphere

Pretty much everyone and their dog has commented on Nick Carr's piece "Is Google Making Us Stupid?" Most offer up banal anecdotes to counter Carr's claim but ignore the primary sources and studies he mentions. I didn't offer my own opinion because I don't think this is a matter of opinion; it's a matter of science. Either research shows there is a new effect or it doesn't.

Two responses in particular bother me. First, Seth Finkelstein criticized Carr for not being "technology-positive" enough and for writing too much in the style of "fogeyism." His worry is that techies won't listen to people who sound old or cranky. That may be true, but the answer isn't to water down criticism. Part of growing up is learning to listen to people unlike yourself — even people you disagree with. A technology background does not teach you to think critically about technology and society; if anything, it leaves you with a deficit (yes, I speak from experience).

Link: http://sethf.com/infothought/blog/archives/001349.html

The second response is by Danah Boyd. I don't know whether she's talking about Carr's piece or something else, but I'll assume she is (my second guess is Mark Bauerlein's The Dumbest Generation). Her post is another meta-comment, this one about how to respond to "quasi-legitimate trolls in an attention economy." She characterizes some writers as attention-seeking trolls and, because she's having trouble ignoring them, asks for advice. In a comment, I asked her to clarify what distinguishes a troll from a rational critic you disagree with, and which books she was talking about. I was rebuffed, so I won't ask again — I'm afraid of appearing to be a troll myself.

My problem with Boyd's point is that it's grossly unfair to call Carr or Bauerlein trolls (Keen and Siegel may be a little closer, but they still don't meet the definition, in my opinion). A troll (a term borrowed from the Internet, of course) is an irrational attention-seeker who ignores logic and simply repeats their opinion to annoy someone. These writers, however, are drawing on real evidence to support their arguments and are engaging in rational discussion. They may be wrong, but they deserve an intelligent response.

There's an irony in Boyd's post — she claims Internet-style trolls are showing up more and more in real life. What she misses is that maybe real life hasn't changed; what has transferred over from the Internet is the habit of labeling people as trolls as an excuse not to listen to them.

Link: http://www.zephoria.org/thoughts/archives/2008/06/22/feeding_quasile.html

I
started this blog three years ago to try to point out the many good
books that have been written on technology's impact on society, as well
as the excellent work that continues to be done by people in fields
such as science and technology studies.  What still surprises me is how
shallow and closed-minded most discussion on the Internet tends to be. 
Most of the smartest stuff is still offline.

College Without Technology

Wyoming Catholic College limits students’ use of cell phones and computers, and the students seem to be doing just fine.  From the Casper Star-Tribune:

In an era when technology is king,
Wyoming Catholic College is positing an against-the-grain conviction:
that great advances in technological achievement, while widely
celebrated, might not in fact be good for people. And they might
actually get in the way of education.

Here, students are
encouraged, and in many ways required, to forgo the world of virtual
connectivity, and engage with the actual world — to go out into the
woods, the mountains and the horse stables and experience what college
officials refer to as "God’s first book."

Student Hannah Gaddis of Casper said the school’s
curriculum kept her so busy and engaged that she never had time to give
the school’s strict technology policy a second thought.

"You kind of realize how much you don’t need these things," she said.

Link: An Audacious Experiment.

I first learned about this from an NPR story (No Tech U) in which they interview a student who clearly gets that technology skills are not that big a deal and are not hard to learn when you need them.

Of course there are other aspects of this school that may not be everyone’s cup of tea — like the exclusively religious and "great (Western) books" curriculum and the apparent endorsement by Bill Bennett.  In America those don’t raise eyebrows (not that they should, necessarily) — but banning iPods sure does.

Negroponte’s $100 laptop is no longer about learning, if it ever was

Ivan Krstic, formerly the director of security architecture for the (now failing rather spectacularly) One Laptop Per Child project, has some strong words about the project’s philosophies, its leader, and the free-software gurus who hijacked the project to push their own agendas.  From his blog:

I quit when Nicholas told me — and not just me — that learning was
never part of the mission. The mission was, in his mind, always getting
as many laptops as possible out there; to say anything about learning
would be presumptuous, and so he doesn’t want OLPC to have a software
team, a hardware team, or a deployment team going forward.

Yeah, I’m not sure what that leaves either.

There are three key problems in one-to-one computer programs:
choosing a suitable device, getting it to children, and using it to
create sustainable learning and teaching experiences. They’re listed in
order of exponentially increasing difficulty.

[…]

That OLPC was never serious about solving deployment, and that it
seems to no longer be interested in even trying, is criminal. Left
uncorrected, it will turn the project into a historical information
technology fuckup unparalleled in scale.

As for the last key problem, transforming laptops into learning is a
non-trivial leap of logic, and one that remains inadequately explained.
No, we don’t know that it’ll work, especially not without teachers. And
that’s okay — the way to find out whether it works might well be by
trying. Sometimes you have to run before you can walk, yeah? But most
of us who joined OLPC believed that the educational ideology
behind the project is what actually set it apart from similar endeavors
in the past. Learning which is open, collaborative, shared, and
exploratory — we thought that’s what could make OLPC work. Because
people have tried plain laptop learning projects in the past, and as the New York Times noted on its front page not so long ago, they crashed and burned.

Nicholas’ new OLPC is dropping those pesky education goals from the
mission and turning itself into a 50-person nonprofit laptop
manufacturer, competing with Lenovo, Dell, Apple, Asus, HP and Intel on their home turf, and by using the one strategy we know doesn’t work. But hey, I guess they’ll sell more laptops that way.

Link: Sic Transit Gloria Laptopi, via Fake Steve Jobs.