November 23, 2014

NYT: The Secret Life of Passwords

A fascinating story on many levels. As published by the New York Times, the piece includes videos of personal anecdotes about the role passwords have played in people's lives. The opening segment about Howard Lutnick is quite moving, especially when you hear him discuss his unique situation in the accompanying audio. I suspect very few of us employ perfectly random passwords; if that describes you, you should find this piece quite an interesting read.

The Secret Life of Passwords

We despise them – yet we imbue them with our hopes and dreams, our dearest memories, our deepest meanings. They unlock much more than our accounts.
By Ian Urbina | Video by Leslye Davis

Howard Lutnick, the chief executive of Cantor Fitzgerald, one of the world’s largest financial-services firms, still cries when he talks about it. Not long after the planes struck the twin towers, killing 658 of his co-workers and friends, including his brother, one of the first things on Lutnick’s mind was passwords. This may seem callous, but it was not.

Like virtually everyone else caught up in the events that day, Lutnick, who had taken the morning off to escort his son, Kyle, to his first day of kindergarten, was in shock. But he was also the one person most responsible for ensuring the viability of his company. The biggest threat to that survival became apparent almost immediately: No one knew the passwords for hundreds of accounts and files that were needed to get back online in time for the reopening of the bond markets. Cantor Fitzgerald did have extensive contingency plans in place, including a requirement that all employees tell their work passwords to four nearby colleagues. But now a large majority of the firm’s 960 New York employees were dead. “We were thinking of a major fire,” Lutnick said. “No one in those days had ever thought of an entire four-to-six-block radius being destroyed.” The attacks also knocked out one of the company’s main backup servers, which were housed, at what until that day seemed like a safe distance away, under 2 World Trade Center.

Hours after the attacks, Microsoft dispatched more than 30 security experts to an improvised Cantor Fitzgerald command center in Rochelle Park, N.J., roughly 20 miles from the rubble. Many of the missing passwords would prove to be relatively secure — the “JHx6fT!9” type that the company’s I.T. department implored everyone to choose. To crack those, the Microsoft technicians performed “brute force” attacks, using fast computers to begin with “a” then work through every possible letter and number combination before ending at “ZZZZZZZ.” But even with the fastest computers, brute-force attacks, working through trillions of combinations, could take days. Wall Street was not going to wait.
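The brute-force search described above can be sketched in a few lines. This is only an illustration of the idea, not the tooling Microsoft actually used; the character set, hash function, and password below are all hypothetical stand-ins.

```python
import hashlib
from itertools import product

def brute_force(target_hash, charset, max_len):
    """Try every combination of charset up to max_len; return the match or None."""
    for length in range(1, max_len + 1):
        for combo in product(charset, repeat=length):
            candidate = "".join(combo)
            if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
                return candidate
    return None

# Tiny demo: a 3-character lowercase password is found in at most 26**3 tries.
# An 8-character mixed-case/symbol password like "JHx6fT!9" multiplies that
# search space into the trillions, which is why such attacks can take days.
target = hashlib.sha256(b"cab").hexdigest()
print(brute_force(target, "abcdefghijklmnopqrstuvwxyz", 3))
```

The same loop structure explains the time problem: each added character multiplies the search space by the size of the character set.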

Microsoft’s technicians, Lutnick recalled, knew that they needed to take advantage of two facts: Many people use the same password for multiple accounts, and these passwords are typically personalized. The technicians explained that for their algorithms to work best, they needed large amounts of trivia about the owner of each missing password, the kinds of things that were too specific, too personal and too idiosyncratic for companies to keep on file. “It’s the details that make people distinct, that make them individuals,” Lutnick said. He soon found himself on the phone, desperately trying to compartmentalize his own agony while calling the spouses, parents and siblings of his former colleagues to console them — and to ask them, ever so gently, whether they knew their loved ones’ passwords. Most often they did not, which meant that Lutnick had to begin working his way through a checklist that had been provided to him by the Microsoft technicians. “What is your wedding anniversary? Tell me again where he went for undergrad? You guys have a dog, don’t you? What’s her name? You have two children. Can you give me their birth dates?”

“Remember, this was less than 24 hours after the towers had fallen,” he said. “The fire department was still referring to it as a search-and-rescue mission.” Families had not accepted their losses. Lutnick said he never referred to anyone as being dead, just “not available right now.” He framed his questions to be an affirmation of that person’s importance to the company, he said. Conversations oscillated between sudden bawling and agonizing silences. “Awful,” he said. Sometimes it took more than an hour to work through the checklist, but Lutnick said he made sure he was never the one to hang up first.

In the end, Microsoft’s technicians got what they needed. The firm was back in operation within two days. The same human sentimentality that made Cantor Fitzgerald’s passwords “weak” ultimately proved to be its saving grace.

November 18, 2014

A Crisis in Cultural Confidence

I was struck by the theme underlying Smith’s article about the phenomenon of young people leaving home to join violent, radical jihadist movements such as Al Qaeda or ISIS: the deeper issue of a society’s confidence in its own identity, values, and purpose.

Cultures, like the people who comprise them, begin to falter and ultimately fade away when they lose their sense of identity, become fixated on the trivial and ephemeral, and fail to foster and maintain a higher-order set of values, or at least principles, that frame behavior and expectations. Oddly, people are drawn to strength, confidence, purpose, and meaning, but few people in ‘leadership roles’ or institutions actually provide them, preferring instead to appeal to baser desires to gain short-term wins.

The warning sounded by Smith: if we don't know who we are and what we stand for, and cannot make a compelling case for why our principles, standards, and way of life are better than the alternatives, then we shouldn't be surprised when more confident cultures begin to eclipse our own and our youth are drawn to them.

Why the Teenage Girls of Europe Are Joining ISIS

Because they want the same things that teenage boys want: a strong sense of meaning and purpose

By Lee Smith | October 22, 2014

Teenage girls are the West’s center of gravity: Virtually all of Western pop culture, the key to our soft power, is tailored to the tastes of teenage girls. Through the wonders of information technology, the mobile phone mass-produced the mores and habits of phone-mad teenage girls locked in their bedrooms. Indeed, Western civilization is a success largely insofar as it has made the world a safe place for teenage girls—to go to school, get a job, and decide who and when to marry, or if they want to marry. When teenage girls turn away from One Direction and embrace ISIS, it means the West is losing.

A Washington Institute poll last week showed that the Islamic State has more support in Europe than it does in the Middle East. The poll reported that only 3 percent of Egyptians, 5 percent of Saudis, and under 1 percent of Lebanese “expressed a positive opinion of the IS.” On the other hand, 7 percent of U.K. respondents had a favorable view of the group, as did 16 percent of French polled—with 27 percent of French citizens between 18-24 responding favorably.

The numbers should hardly come as a surprise. Thousands of young European Muslims have already left the continent for the Middle East to help the organization’s leader, Abu Bakr al-Baghdadi, build an authentic Islamic caliphate. Doubtless thousands more are on their way, to kill and die for an idea they believe in.

It is a striking fact that ISIS appeals not only to young men, but also young European women, many hundreds of whom have gone to Syria and Iraq to marry Islamic State fighters. Sure, some of them, like the 15-year-old French Muslim girl Nora el-Bahty, may have come to regret their decision. But that hardly alters the essential point: The girls sought out IS fighters because the West seems weak and unmanly and they pine for real men who are willing to kill and die for what they believe in.

Why? Europe’s got great health care, welfare, and lots of attractive young men and attractive women who, unlike the vast majority of women in the Middle East outside of Israel, are sexually available. So, why given a choice between a comfortable, if somewhat boring, life as a pharmacist in Hamburg, or fighting and dying in the desert, are thousands of Western Muslims opting for the latter?

Because, for all the awesome social services and consumer goods it can offer, Europe has become incapable of endowing the lives of its citizens, Muslim or not, with meaning. A generation of young European Muslims are giving up their relatively easy lives in Malmö, Marseilles, and Manchester for the battlefields of Syria and Iraq, because Europe is devoid of values worth living—or dying—for. They are leaving for the same reason that Europe’s Jews are moving to Israel: Strength and a sense of purpose can be found elsewhere, whether it’s ISIS, Vladimir Putin, Ali Khamenei, or the IDF.

European security services are worried that the large number of jihadist fighters with Western passports are destined to cause trouble should they come back to the continent. They’re worried, they say, about the special skills militants might obtain abroad and then employ at home—like Mehdi Nemmouche, the Frenchman who killed four people at the Jewish Museum in Brussels in May.

European authorities are missing the much more salient point. Nemmouche may have gone to Syria to fight alongside extremist groups, but it’s not like firing an automatic rifle is a specialized skill you can only learn on a jihadi battlefield. It’s not like you have to travel to the Middle East to learn to hate Jews. The problem isn’t what European Muslims may come back with from the Middle East, but the fact that they’ve left Europe in the first place. Baghdadi’s self-proclaimed caliphate sounds like an inside joke to IS’s two most significant military cadres—the Arab tribes, and former Baathists from Saddam Hussein’s regime. But to the Islamic State’s foreign fighters, especially its Western European contingent, the idea of a caliphate, ripped from the pages of Muslim history, resonates with a kind of existential authenticity missing from the vast and drab European suburbs warehousing Muslim youth.

And it’s precisely the violence of IS that appeals to the Europeans. For the Middle East, after all, despite Ayman al-Zawahiri’s alleged claims that IS is “too extreme” even for al-Qaida, there’s nothing exceptional about the bloodshed. The level of violence—beheadings, crucifixions, etc.—is par for the course in its regional politics. U.S. ally Saudi Arabia beheads criminals in the middle of Riyadh, and President Barack Obama’s new BFF in the region—an Iranian regime he calls rational—hangs criminals from construction cranes. But for the European fighters, the violence is more evidence of authenticity.

Yes, what IS stands for is exceedingly stupid and vicious—like one of the evil Transformer figures that destroys everything in its way. But this is what happens when there’s a vacuum: Ugly ideas fill space. Looking around, it’s hard not to think that the ugly, the vicious, and the stupid have the upper hand these days, with little resistance from the so-called defenders of the good.

Vladimir Putin is a hip-hop icon because he’s got Europe eating out of his hand—he rolls large and can turn off Europe’s lights any time he wants. He can go as far into Ukraine as he likes because he knows the United States won’t stop him. Obama said that Iran won’t get a nuclear weapon, but after already acknowledging the clerical regime’s right to enrich uranium, the White House may now allow Iran to keep even more centrifuges. Israel may have crushed Hamas over the course of a 40-day Operation Protective Edge, but here come the Western nations, led by the United States, hosting a donor conference that will relieve Hamas of all responsibility for having brought death and destruction to Gaza. Why? Because they can no longer summon the vitality necessary to take down a gang of bearded terrorists with RPGs, and so they are hoping instead to buy them off.

What Europe’s disaffected youth see is that the Western powers roll over and take it, again and again. The issue isn’t that we enjoy being humiliated. It’s just that we don’t really believe there’s anything worth fighting for. And that’s why thousands of Europe’s young Muslim men have taken sides against us—and why 15-year-old girls hold us in contempt.

Correction, October 23: French teenager Nora el-Bahty, a Muslim, was misidentified as Nora el-Bathy, Jew. The piece has been amended.


Lee Smith is a senior editor at the Weekly Standard and a senior fellow at the Hudson Institute. He is also the author of the recently published The Consequences of Syria.

"Where Congress went wrong, a candid conversation"

Here is a lengthy article, but I encourage you to take the time to read it. Drawing from first-hand experience in a 2011-2012 Congressional campaign, I can (sadly) report that things are every bit as bad as described below. The primary is the only cycle that really matters, because it tees up each party's candidate for the general election. Those candidates have been selected by the most extreme ends of either party, and by voters (the few who make time to cast a ballot) who choose almost exclusively on the image projected by the campaigns. Such images, and the means to project them, are made possible by money: no money means no image, no means to project it, and therefore no votes. The average citizen has minimal interest in politics (‘politics’ in the classic sense of the processes that lead to legislation), minimal understanding of the issues, and zero interest in being involved in any meaningful way. Reporters who cover politics are usually either cynical (the older ones) or just as uninformed as the typical voter (the younger ones), so the reporting does little to differentiate candidates in any helpful way (which would matter only if folks took sufficient interest to pay attention in the first place).

Those involved in politics decry the current state of affairs and want Washington to “put aside the rancorous partisan bickering and come together to govern effectively,” but then reject out of hand any effort by their own elected official to step outside of a narrowly defined position on any issue. To compromise or negotiate is to betray the true faith. I saw up close the vying for party and ideological purity, with candidates each proclaiming how they were more conservative, or party-pure, or local, or religious (especially Christian), or working class, or non-political than their competitors. There was almost no time given to actually talk about the particulars or complexities of any specific issue, and no real interest by the public in entertaining such discussion. There is a comprehensively unrealistic expectation by those on the margins that they will ‘win’ by ceding no ground, driven by the belief that nearly everyone else along the bell curve will swing to the absolute position they have staked out near the end of the spectrum.

Successful candidates feed red meat to the mob and are most productive when spending their time fund-raising, something the author’s interviews bear out.

Some choice bits from the article that really resonated:
“With the way we draw districts, with so few competitive districts, we've bifurcated ourselves as a civilization.”

"You can't be moderate. Who votes in primaries? You have a 10 percent turnout in a primary election in Georgia, and Republicans are 30 percent of the population. So 10 percent of 30 percent—that's 3 percent of the population voting to choose the nominee, and then if it's a multiperson race, and the winner gets 35 percent, that's one third of 3 percent—1 percent of the population chooses the nominee, who in a gerrymandered district will be the eventual member of Congress.”
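The arithmetic in that excerpt is easy to verify; here is a quick sketch using the quoted (illustrative) Georgia figures:

```python
# Reproducing the primary-turnout arithmetic from the excerpt above.
# All figures are the illustrative ones quoted, not measured data.
primary_turnout = 0.10      # 10 percent of eligible voters turn out
republican_share = 0.30     # Republicans are 30 percent of the population
winner_share = 0.35         # winner of a multiperson race takes about 35 percent

primary_voters = primary_turnout * republican_share    # 3 percent of population
nominee_choosers = primary_voters * winner_share       # about 1 percent

print(f"{primary_voters:.1%} of the population votes in the primary")
print(f"{nominee_choosers:.2%} of the population chooses the nominee")
```

In a safely gerrymandered district, that roughly 1 percent effectively picks the eventual member of Congress.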

“Many of them argued that because conflict is rewarded with attention, more actual conflict is fostered, which is then amplified by social media, which blasts powerful narratives at members around the clock—who cares if they're true?—largely obscuring their meek attempts to actually get something done. All of that drives what most members think of as a perception gap between the way things are and the way they seem to be. The "twenty-four-hour news cycle" was mentioned by nearly every one of the members I interviewed as something that makes their lives hell and, more important, makes governing very hard. "It's the coliseum," says Joaquin Castro, Democrat of Texas. "And in the coliseum, people get hurt for sport."”

“And any spare moment that in the past may have been used to build trust between the members of Congress is now spent begging for money, particularly since the Citizens United Supreme Court ruling, which permitted unlimited spending by corporations or associations in support of political candidates. And it's not just "front line" members—those in tightly contested districts—who have to spend their allotment of hours per week at the call center, working donors. It's everybody. Some members report having to spend thirty hours a week on fundraising alone.”

“And with that, a final discovery: When you talk to so many members of Congress, you realize that those who are widely reviled can do much more damage than those who are widely respected can do good, and with half the effort.”

Some final thoughts of my own: If there is a solution, it lies in the hands of a citizenry that cares enough to spend time actually understanding how and why their world works the way it does, to understand why short-term interests typically crowd out long-term good, and to hold their elected representatives accountable for behaving responsibly -- aided by a press that decides to report the news and conduct responsible investigative journalism instead of playing a partisan and often destructive role. Again, reaching back to recent experience: there were just a couple of reporters who took this stuff seriously, and one in particular did a terrific job of legitimate investigative reporting. Unfortunately, too many people bought into the advertising campaigns, and not enough thought about the solid journalism that was actually provided.

Hmmm...depending on the people...I'm reminded of Franklin's oft-quoted exchange with a lady following the close of the Constitutional Convention in Philadelphia in 1787. "Well, Doctor, what have we got--a Republic or a Monarchy?" "A Republic, if you can keep it."

It's really all about education, societal standards, and people giving a hoot. No shortcuts here.

I have had opportunities for lengthy chats with members of Congress, key staffers, and others deeply experienced in the political campaigning process. Their common observation: the public gets what it votes for.

Where Congress went wrong, a candid conversation

By Mark Warren

I spoke with ninety members of the House and Senate about what's gone so wrong in Congress. Sometimes it got a little emotional.

"I didn't get elected to Congress to not get things done—most people here want to get things done. I didn't get elected to Congress to make meaningless speeches on C-SPAN and tell lies about people. I didn't get elected to Congress to scare the hell out of the country and drive the sides further apart. I didn't get elected to Congress because I love politics—I hate politics, to be perfectly honest, and if I didn't before I got here, I do now… ."
The man is very angry, about the way his life is going, about Washington, about some things he has found himself saying that he wishes he could take back—he got carried away, total herd mentality, just so juvenile. People in public life should take stuff back more often, apologize more, and correct course more—now that would be making a real statement, maybe even be a breath of fresh air for the public. But he would just be screwing himself, he goes on, because those guys at Heritage Action or Club for Growth or Americans for Prosperity or some other goddamn group with an Orwellian name that thrives off of division and exists to create conflict might primary him, drop $3 million on his head, and he would be dead. And the way his district is drawn, you can't ever be conservative enough. He could get up at one of his town halls and say that the president is a transvestite Muslim from Mars and get a standing ovation. He wants to do the right thing and make a public stand for greater decency and civility in public life. But he can't. Oh, in his own quiet way he does. He has many friends who happen to be Democrats. "No matter what it seems, we don't hate each other," he says. "We are civil, we try to get to know each other, and most of us work hard to find areas of agreement, things that we can make progress on. People are stunned when I tell them that, because from the outside it just looks so bad."

At the same time, it's worse than he thought it would be before he was elected, the congressman says.


I saw the movie Fury a couple of weeks ago. Like any movie it had its strong and weak points, but overall it did a superb job portraying the human element of war, especially as it pertains to those involved in close combat and the hardening of attitudes among those who have seen and experienced too much for too long and still have more of a meat-grinding job ahead of them before their war is finished. If you saw and liked Das Boot (the 1981 movie about a WWII German submarine crew), you will be captivated by Fury's portrayal of the claustrophobic intensity of tank-on-tank engagements. The special effects were superb, bringing the viewer into the gripping lethality of a battlefield. It isn't a pleasant movie to watch: the language is very rough, though certainly what one hears among troops in the field, and the movie is quite explicit in showing the casualties of battle, though it doesn't dwell too long on any particular incident. It is much more intimate in focus than Spielberg's Saving Private Ryan and more condensed, obviously, than the HBO series Band of Brothers. It's a Hollywood movie, so there are almost by default implausible aspects to it, but all in all I thought its intensity, gritty realism, and personal story lines made it worth watching. If you have an interest in seeing what close combat in conventional war is like, Fury is a good place to start. Just don't invite your little ones to the show.

November 17, 2014

How Do People Get New Ideas?

Here are some wonderful observations from Isaac Asimov about the futility of 'brainstorming sessions,' process- and schedule-driven efforts to 'innovate' or develop new concepts, and the corrupting influence of bumper-sticker terms like game-changing, transformational, leap-ahead, and revolutionary, all of which should be stricken from our vocabulary.

I understand why people in nearly every profession hold up examples of extraordinary accomplishment for study and emulation. The basic idea is that if one studies how someone else rose to the peak of their field in business, sports, the arts, or what-have-you, one might be able to copy or incorporate the factors or characteristics critical to that success into one's own practices. I just don't think it's that simple. Certainly success can result from any number of things, not least of which is luck. Sometimes it is a consequence of being in the right place at the right time, or of an impossible-to-engineer convergence of disparate threads that creates a situation in a place and time that just can't be replicated. It is true that hard work, discipline, perseverance, study, and experience are essential, and these traits and habits can be learned and improved. But 'extraordinary' is extraordinary for a reason: it is something special, something quite unique, and such occurrences are rare by definition.

I think group efforts generally produce groupthink and watered-down consensus products. The most amazing advances come from individuals who take the time to think, explore, experiment, and take risks with new ideas. But this calls for personal courage and a willingness to step outside the mainstream to pursue something no one else is willing to do or has even thought of. If it occurs within an organization, it is because that organization recognizes this and provides the freedom for such exploration.

Something to think about when pursuing your own efforts, observing how institutions try to force innovation, or encouraging a young student who must find the courage to be different from his or her peers.

Isaac Asimov
October 20, 2014

Isaac Asimov Mulls “How Do People Get New Ideas?”

Note from Arthur Obermayer, friend of the author:

In 1959, I worked as a scientist at Allied Research Associates in Boston. The company was an MIT spinoff that originally focused on the effects of nuclear weapons on aircraft structures. The company received a contract with the acronym GLIPAR (Guide Line Identification Program for Antimissile Research) from the Advanced Research Projects Agency to elicit the most creative approaches possible for a ballistic missile defense system. The government recognized that no matter how much was spent on improving and expanding current technology, it would remain inadequate. They wanted us and a few other contractors to think “out of the box.”

When I first became involved in the project, I suggested that Isaac Asimov, who was a good friend of mine, would be an appropriate person to participate. He expressed his willingness and came to a few meetings. He eventually decided not to continue, because he did not want to have access to any secret classified information; it would limit his freedom of expression. Before he left, however, he wrote this essay on creativity as his single formal input. This essay was never published or used beyond our small group. When I recently rediscovered it while cleaning out some old files, I recognized that its contents are as broadly relevant today as when he wrote it. It describes not only the creative process and the nature of creative people but also the kind of environment that promotes creativity.


How do people get new ideas?

Presumably, the process of creativity, whatever it is, is essentially the same in all its branches and varieties, so that the evolution of a new art form, a new gadget, a new scientific principle, all involve common factors. We are most interested in the “creation” of a new scientific principle or a new application of an old one, but we can be general here.

One way of investigating the problem is to consider the great ideas of the past and see just how they were generated. Unfortunately, the method of generation is never clear even to the “generators” themselves.

But what if the same earth-shaking idea occurred to two men, simultaneously and independently?

Speed Kills -- Slowing Down in an Ever Faster World

This is a superb essay on the importance of slowing down in order to cope with and succeed in an increasingly faster-paced world. Unfortunately, you must slow down enough to actually read the article instead of quickly scanning it. In my opinion, the most important points are made in the latter half where the author discusses troubling trends affecting higher education.

What our world increasingly needs is to slow down enough to assess, understand, and contemplate matters so that solutions to problems are relevant, substantive, and have a shelf-life longer than the iPhone 5.

So, grab a cup of coffee or a nice cup of tea and take some time to mull over this article and its implications.

Speed Kills

Fast is never fast enough

Photograph by Stephen Wilkes courtesy of Peter Fetterman Gallery
"Sleeker. Faster. More Intuitive" (The New York Times); "Welcome to a world where speed is everything" (Verizon FiOS); "Speed is God, and time is the devil" (chief of Hitachi’s portable-computer division). In "real" time, life speeds up until time itself seems to disappear—fast is never fast enough, everything has to be done now, instantly. To pause, delay, stop, slow down is to miss an opportunity and to give an edge to a competitor. Speed has become the measure of success—faster chips, faster computers, faster networks, faster connectivity, faster news, faster communications, faster transactions, faster deals, faster delivery, faster product cycles, faster brains, faster kids. Why are we so obsessed with speed, and why can’t we break its spell?

The cult of speed is a modern phenomenon. In "The Futurist Manifesto" in 1909, Filippo Tommaso Marinetti declared, "We say that the splendor of the world has been enriched by a new beauty: the beauty of speed." The worship of speed reflected and promoted a profound shift in cultural values that occurred with the advent of modernity and modernization. With the emergence of industrial capitalism, the primary values governing life became work, efficiency, utility, productivity, and competition. When Frederick Winslow Taylor took his stopwatch to the factory floor in the early 20th century to increase workers’ efficiency, he began a high-speed culture of surveillance so memorably depicted in Charlie Chaplin’s Modern Times. Then, as now, efficiency was measured by the maximization of rapid production through the programming of human behavior.

With the transition from mechanical to electronic technologies, speed increased significantly. The invention of the telegraph, telephone, and stock ticker liberated communication from the strictures imposed by the physical means of conveyance. Previously, messages could be sent no faster than people, horses, trains, or ships could move. By contrast, immaterial words, sounds, information, and images could be transmitted across great distances at very high speed. During the latter half of the 19th century, railway and shipping companies established transportation networks that became the backbone of national and international information networks. When the trans-Atlantic cable (1858) and transcontinental railroad (1869) were completed, the foundation for the physical infrastructure of today’s digital networks was in place.

Fast-forward 100 years...

The Creepy New Wave of the Internet

Building upon my last post...This is a very good, balanced, and I believe sobering article I encourage you to take the time to read. I reject the idea that a trend simply extends linearly without deviation. People respond to changing conditions, often in unanticipated ways. The Internet of Things will indeed evolve but not in pure form as some advocates presume. I think there will be points or areas that generate a backlash from people who find that they do indeed want to retain some measure of privacy. I also believe there will always be competition and a divide between those who produce and those who primarily consume (makers vs takers), between those who contribute and those who continue to feed off of others, and between those who have ambition, creative capabilities, skills/talents, intellectual curiosity, etc., and those who do not. A utopian leveling of society enabled by technology is as fictional as the literary genre that explores such. I am much more concerned about the commercialization of personal data (whether the data is accurate in portraying a person or not) and its exploitation by people/entities/organizations for their own purposes with little awareness or control by the average person.

There are certainly implications for the extraordinary array of policies that pertain to all aspects of the evolving IoT from personal privacy rights, to commercial sector obligations/permissions/use, to government use that includes law enforcement, intelligence gathering, court access, tax collection enforcement, health care/insurance regulations, etc. Clearly, the technology writ large and its application is outpacing the various mechanisms that govern it.

The Creepy New Wave of the Internet

Penelope Umbrico/Mark Moore Gallery, Los Angeles A detail of Penelope Umbrico’s Sunset Portraits from 11,827,282 Flickr Sunsets on 1/7/13, 2013. For the project, Umbrico searched the website Flickr for scenes of sunsets in which the sun, not the subject, predominated. The installation, consisting of two thousand 4 x 6 C-prints, explores the idea that ‘the individual assertion of “being here” is ultimately read as a lack of individuality when faced with so many assertions that are more or less all the same.’ A collection of her work, Penelope Umbrico (photographs), was published in 2011 by Aperture.
Every day a piece of computer code is sent to me by e-mail from a website to which I subscribe called IFTTT. Those letters stand for the phrase “if this then that,” and the code is in the form of a “recipe” that has the power to animate it. Recently, for instance, I chose to enable an IFTTT recipe that read, “if the temperature in my house falls below 45 degrees Fahrenheit, then send me a text message.” It’s a simple command that heralds a significant change in how we will be living our lives when much of the material world is connected—like my thermostat—to the Internet.
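The "recipe" pattern the article describes is just a trigger condition paired with an action. Here is a minimal sketch of that idea in Python; it is an illustration only (IFTTT's real recipes run as hosted triggers on its servers, and the `make_recipe` function and alert text are hypothetical):

```python
# A minimal "if this then that" sketch: a trigger predicate paired
# with an action. Hypothetical illustration, not IFTTT's actual API.

def make_recipe(trigger, action):
    """Bundle a trigger predicate and an action into a recipe function."""
    def recipe(reading):
        # "If this..." - evaluate the trigger against the sensor reading.
        if trigger(reading):
            # "...then that" - fire the action.
            return action(reading)
        return None
    return recipe

# "If the temperature in my house falls below 45 degrees Fahrenheit,
# then send me a text message."
low_temp_alert = make_recipe(
    trigger=lambda temp_f: temp_f < 45,
    action=lambda temp_f: f"Text: house temperature is {temp_f}F",
)

print(low_temp_alert(40))  # condition met: returns the alert text
print(low_temp_alert(68))  # condition not met: returns None
```

The point of the pattern is that the trigger can be any networked sensor and the action any networked service, which is what makes a connected thermostat more than a thermostat.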

It is already possible to buy Internet-enabled light bulbs that turn on when your car signals your home that you are a certain distance away and coffeemakers that sync to the alarm on your phone, as well as WiFi washer-dryers that know you are away and periodically fluff your clothes until you return, and Internet-connected slow cookers, vacuums, and refrigerators. “Check the morning weather, browse the web for recipes, explore your social networks or leave notes for your family—all from the refrigerator door,” reads the ad for one.

Welcome to the beginning of what is being touted as the Internet’s next wave by technologists, investment bankers, research organizations, and the companies that stand to rake in some of an estimated $14.4 trillion by 2022—what they call the Internet of Things (IoT). Cisco Systems, which is one of those companies, and whose CEO came up with that multitrillion-dollar figure, takes it a step further and calls this wave “the Internet of Everything,” which is both aspirational and telling. The writer and social thinker Jeremy Rifkin, whose consulting firm is working with businesses and governments to hurry this new wave along, describes it like this:
The Internet of Things will connect every thing with everyone in an integrated global network. People, machines, natural resources, production lines, logistics networks, consumption habits, recycling flows, and virtually every other aspect of economic and social life will be linked via sensors and software to the IoT platform, continually feeding Big Data to every node—businesses, homes, vehicles—moment to moment, in real time. Big Data, in turn, will be processed with advanced analytics, transformed into predictive algorithms, and programmed into automated systems to improve thermodynamic efficiencies, dramatically increase productivity, and reduce the marginal cost of producing and delivering a full range of goods and services to near zero across the entire economy.
In Rifkin’s estimation, all this connectivity will bring on the “Third Industrial Revolution,” poised as he believes it is to not merely redefine our relationship to machines and their relationship to one another, but to overtake and overthrow capitalism once the efficiencies of the Internet of Things undermine the market system, dropping the cost of producing goods to, basically, nothing. His recent book, The Zero Marginal Cost Society: The Internet of Things, the Collaborative Commons, and the Eclipse of Capitalism, is a paean to this coming epoch.

It is also deeply wishful, as many prospective arguments are, even when they start from fact. And the fact is, the Internet of Things is happening, and happening quickly. Rifkin notes that in 2007 there were ten million sensors of all kinds connected to the Internet, a number he says will increase to 100 trillion by 2030. A lot of these are small radio-frequency identification (RFID) microchips attached to goods as they crisscross the globe, but there are also sensors on vending machines, delivery trucks, cattle and other farm animals, cell phones, cars, weather-monitoring equipment, NFL football helmets, jet engines, and running shoes, among other things, generating data meant to streamline, inform, and increase productivity, often by bypassing human intervention. Additionally, the number of autonomous Internet-connected devices such as cell phones—devices that communicate directly with one another—now doubles every five years, growing from 12.5 billion in 2010 to an estimated 25 billion next year and 50 billion by 2020.

For years, a cohort of technologists, most notably Ray Kurzweil, the writer, inventor, and director of engineering at Google, have been predicting the day when computer intelligence surpasses human intelligence and merges with it in what they call the Singularity. We are not there yet, but a kind of singularity is already upon us as we swallow pills embedded with microscopic computer chips, activated by stomach acids, that will be able to report compliance with our doctor’s orders (or not) directly to our electronic medical records. Then there is the singularity that occurs when we outfit our bodies with “wearable technology” that sends data about our physical activity, heart rate, respiration, and sleep patterns to a database in the cloud as well as to our mobile phones and computers (and to Facebook and our insurance company and our employer).

Cisco Systems, for instance, which is already deep into wearable technology, is working on a platform called “the Connected Athlete” that “turns the athlete’s body into a distributed system of sensors and network intelligence…[so] the athlete becomes more than just a competitor—he or she becomes a Wireless Body Area Network, or WBAN.” Wearable technology, which generated $800 million in 2013, is expected to make nearly twice that this year. These are numbers that not only represent sales, but the public’s acceptance of, and habituation to, becoming one of the things connected to and through the Internet.

One reason that it has been easy to miss the emergence of the Internet of Things, and therefore miss its significance, is that much of what is presented to the public as its avatars seems superfluous and beside the point. An alarm clock that emits the scent of bacon, a glow ball that signals if it is too windy to go out sailing, and an “egg minder” that tells you how many eggs are in your refrigerator no matter where you are in the (Internet-connected) world, revolutionary as they may be, hardly seem the stuff of revolutions; because they are novelties, they obscure what is novel about them.

And then there is the creepiness factor...


I've finally freed up some time to resume posting items to this blog. To those who have been checking in from time to time, I deeply appreciate your patience and perseverance. What better way to restart this thing than to share some items pertaining to our rapidly evolving big-data world.
Several months ago I decided the time had come to bid farewell to Facebook, so I deleted my account and posted a version of what follows to explain my rationale. I am no neo-Luddite; in fact, I greatly enjoy the incredible array of modern wonders made possible by technology, and I've every confidence our future will be just as amazing to us as our time would be to someone from even the not-too-distant past. But I'm increasingly concerned about the amount of personal information that is now stored in various forms by all manner of entities, from government to commercial.
Usually there is a well-meaning, useful, sometimes even noble purpose for data collection at the outset. Companies like to collect data on people so they can more effectively sell products. The medical community likes to collect data so it can better understand all aspects of health, with the intent of providing more effective treatment and even preventive services for a higher daily quality of life. Educators like to collect data so they can better understand the learning process and adjust education efforts to be more relevant and effective, perhaps even at the individual level. Government agencies presumably like to collect data so they can more effectively use taxpayer monies (at least that's their argument) or, in the intel and law enforcement communities, more rapidly acquire awareness of threats and dangers in the hope of preventing bad things from happening. I get that. But very quickly the ideal clashes with reality. The original intent doesn't account for the nature of people.

Insurance companies seek health information so they can adjust coverage. Let's say advances in genetics allow a doctor to tell you that you have some probability of developing a serious health problem. Does your insurance company increase your premium or drop your coverage altogether? Let's say behaviorists develop an algorithm they believe predicts with high likelihood an individual's potential for some action. Does law enforcement act on it with the intent to prevent a crime even though no crime has yet been committed? Educators seek insight into home environments through questionnaires administered at school, but the children who complete them have no ability to provide context for answers to questions whose wording might reflect a philosophical bias of the person who developed the questionnaire in the first place. Does the school act with law enforcement or social services if it concludes there's something amiss, even though that conclusion is based on inherently flawed assumptions?

It is a truth that people begin with one idea in mind, but when conditions change and new opportunities emerge, their objectives and perspectives change too. The people collecting information may have one idea in mind at the beginning, but after compiling loads of data they tend to find other things that can be done with it that just weren't imagined when the project was started. Government and commercial entities now have access to personal preferences, real-time location tracking, online behavior (websites visited, items purchased, times of activities, networks of contacts, etc.), and religious, political, social, and economic beliefs, able to be derived if not explicitly stated. As I said, I'm not a Luddite, nor am I a conspiracy theorist, technophobe, or anti-government activist. I'm just a guy who likes his privacy, recognizes that the human factor is fundamental to everything we do, and abhors the idea that free will might be replaced by predictive software.

Humans 'reply all' when they shouldn't. They compromise sensitive information. They lose hard drives full of Social Security numbers and send out personal contact information and political donation histories when they shouldn't. They exploit the tools to which they have access to satisfy voyeuristic tendencies (at the benign level) or to gain advantage over rivals (at the nefarious level). They interpret data to suit their biases or to promote agendas. It happens all the time.

I'll continue to use a cellphone and shop online from time to time. And I'll continue posting to this blog; most of the stuff I posted to Facebook was better suited to a blog format anyway. But I'm ever more reluctant to willingly contribute, any more than I have to, all the little bits and pieces of my life to the big pool of metadata over which I have no control.
Google's mission statement is extraordinarily noble: "to organize the world’s information and make it universally accessible and useful." It's beyond my ability to imagine all that might be accomplished for the good of humanity when taking that statement at face value. And I believe there was noble purpose in it when conceived, just like all the other noble ideas that spur people to try to advance the human condition. Facebook similarly has lofty ideas about creating an environment through which people can connect with each other regardless of physical location. I think Facebook currently has about 1.3 billion users, making it the third largest population group in the world (after China and India). There are other examples. But I just can't get past the fact that behind it all are people who can be honest, ethical, well-meaning, and noble one moment but also carry with them all the weaknesses, foibles, ambitions, and ignoble character flaws that make us human. It’s just the way it is.

I’m reminded of Robert Frost’s poem “Mending Wall” and its immortal line, "Good fences make good neighbors." There is a great deal of wisdom in that phrase and a deep understanding of the human condition.