Almost everyone can agree that one of the big differences between us and our ancestors of five hundred years ago is that they lived in an “enchanted” world, and we do not; at the very least, we live in a much less “enchanted” world. We might think of this as our having “lost” a number of beliefs and the practices which they made possible. But more, the enchanted world was one in which these forces could cross a porous boundary and shape our lives, psychic and physical. One of the big differences between us and them is that we live with a much firmer sense of the boundary between self and other. We are “buffered” selves. We have changed.
Do your work, your best work, the work that matters to you. For some people, you can say, “hey, it’s not for you.” That’s okay. If you try to delight the undelightable, you’ve made yourself miserable for no reason.
It’s sort of silly to make yourself miserable, but at least you ought to reserve it for times when you have a good reason.
The novel is novel, but it is also, typically, news—the tidings of the world around us. … The novel reaches in and out at once. Like no other art, not poetry or music on the one hand, not photography or movies on the other, it joins the self to the world, puts the self in the world, does the deep dive of interiority and surveils the social scope. …
The self in society: the modern question. The novel is coeval with other phenomena that first appeared in full-fledged form in the 18th century—like privacy and sensibility and sentiment and boredom, all of which are closely linked to its development. Novel-reading is indeed unusually private, unusually personal, unusually intimate. It doesn’t happen out there, in front of our eyes; it happens in here, in our heads. The form’s relationship to time is also unique. The novel isn’t static, like painting and sculpture, but though it tells a story, it doesn’t unfold in an inexorable progression, like music, dance, theater, or film. The reader, not the clock, controls the pace. The novel allows you the freedom to pause: to savor a phrase, contemplate a meaning, daydream about an image, absorb the impact of a revelation—make the experience uniquely your own.
More than with any other form of art, the relationships we have with novels are apt to approach the kind we have with people. For a long time, novels were typically named after people (Tom Jones, Emma, Jane Eyre), but that is not the crux of it. What makes our experience of novels so personal is not that they have protagonists, but that they have narrators. Paintings and photographs don’t, and neither, with rare (and usually unfortunate) exception, do movies or plays. Novels bring another subjectivity before us; they give us the illusion of being addressed by a human being.
They are also exceptionally good at representing subjectivity, at making us feel what it’s like to inhabit a character’s mind. … The camera proposes, by its nature, an objectivist aesthetic; its techniques are very crude for representing that which can’t be seen, the inner life. (“I hate cameras,” Schmidt quotes Steinbeck as having remarked. “They are so much more sure than I am about everything.”) …
Novels call us out. “In the intensity of our engagement,” Schmidt remarks, “we ourselves are judged.” … As the characters are tested, so are we. What you read becomes a mark of election, and even more, how. (“Books—oh! no,” says Elizabeth Bennet to Mr. Darcy. “I am sure we never read the same, or not with the same feelings.”) The novel was a smithy, perhaps the smithy, in which the modern consciousness was forged.
The modern consciousness, but not the postmodern one. The novel’s days of cultural preeminence have long since gone. The form rose to primacy across the 19th century, achieved a zenith of prestige in modernism, then yielded pride of place to the new visual media. It is no accident, perhaps, that the modernist anni mirabiles after the First World War (the years of Ulysses, Proust, Mrs. Dalloway, The Magic Mountain, The Great Gatsby, and others) directly preceded the invention of the talkies—a last, astounding efflorescence.
This is not to say that great novels haven’t continued and won’t continue to be written. It is to start to understand why people have been mooting the “death of the novel” ever since that shift in cultural attention, as well as why the possibility is met, by some, as such a calamity. Privacy, solitude, the slow accumulation of the soul, the extended encounter with others—the modern self may be passing away, but for those who still have one, its loss is not a little thing. Schmidt reminds us what’s at stake, for novels and their intercourse with selves. The Novel isn’t just a marvelous account of what the form can do; it is also a record, in the figure who appears in its pages, of what it can do to us. The book is a biography in that sense, too. Its protagonist is Schmidt himself, a single reader singularly reading.
Manufacturers may finally be getting a break from price drops because Moore’s Law is no longer ruling the roost.
The factor not included in the data is the lack of increase in processor speeds, since those are limited by frequency. If you graphed the quality of the transistor per dollar, rather than the price per “machine,” you would see a very different story. Customers are no longer buying machines that offer a compelling improvement in processing speed. A four-year-old computer today is more analogous to a four-year-old TV than to a four-year-old computer in 1999. The power envelope is the limiting factor, not the geometry that drove the earlier heyday of Moore’s Law. One result of this is that low-powered devices were not limited by the same problems and were actually able to ramp frequency while desktop computers could not. A cell phone now offers much of the functionality of a $400 computer. Sales of traditional computers will continue to drop because product turnover is slowing and there is more competition from other kinds of devices. Dropping demand for computers definitely means there is less investment money to pay for the next process node.
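The distinction the commenter draws between “price per machine” and “transistors per dollar” can be made concrete with a toy calculation. If density doubles every two years (the classic Moore’s Law cadence) while a machine’s sticker price stays flat, cost per transistor still halves every generation even though the buyer sees no price drop. The figures below are purely illustrative assumptions, not real market data:

```python
# Toy illustration: flat machine price vs. improving transistors per dollar.
# All numbers are hypothetical, chosen only to show the shape of the trend.

machine_price = 400.0      # assumed flat sticker price, in dollars
base_transistors = 1e9     # assumed transistor count in year 0

for year in range(0, 9, 2):
    # Classic Moore's Law cadence: density doubles every two years.
    transistors = base_transistors * 2 ** (year / 2)
    per_dollar = transistors / machine_price
    print(f"year {year}: price ${machine_price:.0f}, "
          f"{per_dollar:.2e} transistors per dollar")
```

On these assumptions, transistors per dollar rises 16-fold over eight years while the graph of machine prices would be a flat line, which is the “very different story” the two metrics tell.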
What I am curious about is why the obvious slowdown in computer speed increases has been so ignored by the press. There has been more technology innovation than in the ’90s, but we have seen less growth in productivity. Perhaps if we could name the problem, the solution would get more attention. Two types of processors have continued to improve while the CPUs of average PCs have stagnated: the graphics processor, because of its ability to harness more improvement from running multiple cores, and the low-power computing device. Areas of economic productivity have followed in the wake of each of these two exceptions. Petroleum exploration has benefited from the ability to find oil that was previously not practical to extract, because the data is processed on multicore machines with large graphics processors. Cell phones are another growth area, with smartphones leading the way toward new business innovation. Apple’s big bet on ARM makes much more sense if we look at what happened to the fifth-generation PowerPC chip and how it failed to match the growth in processor speeds that Intel was producing.
… a really serious underlying problem in our networked world — the stupendous power that superior knowledge, IQ and technical understanding confers on some people. We are completely dependent on systems that are so complex that virtually nobody understands how they work — and how they can be manipulated and gamed by those who do understand them. The obvious rejoinder is “twas ever thus”, but I think that’s too complacent. What’s different now is that the level of technical expertise needed is beyond the reach or capacity of almost everyone. Which means that the elites who do ‘get’ it — and those who employ them — have colossal power.
Every act of regulation by authority is an erosion of liberty. That tells us what liberty is, and that you can have too much of a good thing. Liberty pushed to extreme is anarchy. Regulation pushed to extreme is dictatorship. Millions of words have been devoted to finding the balance, and the question remains open. The collective drift towards more regulation in the western liberal democratic model is driven by good intentions and by a mad dream of perfect fairness in which individual discretion and individual responsibility are intrinsically subversive. Infants and madmen used to be the traditional exceptions to the general notion that people should be trusted to make their own accommodations with each other, and that authority is not there to do our thinking for us. We are all halfway to being treated like infants and madmen now. As civilisation advances in complexity, liberties give way. So be it, but it’s as well to know and name the retreat of liberty for what it is, and not to call it something else, before the retreat becomes a rout.
The Independent revealed in October 2013 that Paterson [Secretary of State for Environment, Food and Rural Affairs] has never been briefed on climate change by the government’s chief scientific adviser Sir Ian Boyd and refused to take a briefing offered to him by Professor David MacKay, the chief scientific adviser at the Department of Energy and Climate Change (DECC). On Monday, a cabinet colleague of Paterson’s told the Mail Online: “He isn’t climate sceptic, he’s climate stupid.”