电铝理店

Electric Aluminum Truth Store

IBM computer a guest on the game show Jeopardy!

  Source

Computer could make 2 'Jeopardy!' champs deep blue

AP

By JIM FITZGERALD and DAVID R. MARTIN, Associated Press – Fri Jan 14, 8:55 am ET

YORKTOWN HEIGHTS, N.Y. – The clue: It's the size of 10 refrigerators, has access to the equivalent of 200 million pages of information and knows how to answer in the form of a question.

The correct response: "What is the computer IBM developed to become a 'Jeopardy!' whiz?"

Watson, which IBM claims as a profound advance in artificial intelligence, edged out game-show champions Ken Jennings and Brad Rutter on Thursday in its first public test, a short practice round ahead of a million-dollar tournament that will be televised next month.

Later, the human contestants made jokes about the "Terminator" movies and robots from the future. Indeed, four questions into the round you had to wonder if the rise of the machines was already upon us — in a trivial sense at least.

Watson tore through a category about female archaeologists, repeatedly activating a mechanical button before either Ken Jennings or Brad Rutter could buzz in, then nailing the questions: "What is Jericho?" "What is Crete?"

Its gentle male voice even scored a laugh when it said, "Let's finish 'Chicks Dig Me.'"

Jennings, who won a record 74 consecutive "Jeopardy!" games in 2004, then salvaged the category, winning $1,000 by identifying the prehistoric human skeleton Dorothy Garrod found in Israel: "What is Neanderthal?"

He and Rutter, whose nearly $3.3 million in prize money is a show record, had more success on questions about children's books and the initials "M.C.," though Watson knew about "Harold and the Purple Crayon" and that it was Maurice Chevalier who sang "Thank Heaven for Little Girls" in the film "Gigi." The computer pulled in $4,400 in the practice round, compared with $3,400 for Jennings and $1,200 for Rutter.

Watson is powered by 10 racks of IBM servers running the Linux operating system. It's not connected to the Internet but has digested encyclopedias, dictionaries, books, news, movie scripts and more.

The system is the result of four years of work by IBM researchers around the globe, and although it was designed to compete on "Jeopardy!" the technology has applications well beyond the game, said John Kelly III, IBM director of research. He said the technology could help doctors sift through massive amounts of information to draw conclusions for patient care, and could aid professionals in a wide array of other fields.

"What Watson does and has demonstrated is the ability to advance the field of artificial intelligence by miles," he said.

Watson, named for IBM founder Thomas J. Watson, is reminiscent of IBM's famous Deep Blue computer, which defeated chess champion Garry Kasparov in 1997. But while chess is well-defined and mathematical, "Jeopardy!" presents a more open-ended challenge involving troves of information and complexities of human language that would confound a normal computer.

"Language is ambiguous; it's contextual; it's implicit," said IBM scientist David Ferrucci, a leader of the Watson team. Sorting out the context — especially in a game show filled with hints and jokes — is an enormous job for the computer, which also must analyze how certain it is of an answer and whether it should risk a guess, he said.

The massive computer itself did not stand at the podium between Jennings and Rutter; it was represented there by an IBM Smarter Planet icon on an LCD screen.

The practice round was played on a stage at an IBM research center in Yorktown Heights, 38 miles north of Manhattan and across the country from the game show's home in Culver City, Calif. A real contest among the three, to be televised Feb. 14-16, will be played at IBM on Friday.

The winner of the televised match will be awarded $1 million. Second place gets $300,000, third place $200,000. IBM, which is headquartered in Armonk, said it would give its winnings to charity, while Jennings and Rutter said they would give away half of theirs.

In a question-and-answer session with reporters after the practice round, Rutter and Jennings made joking reference to the jump in technology Watson represents.

"When Watson's progeny comes back to kill me from the future," Rutter said, "I have my escape route planned just in case."

Jennings said someone suggested his challenge was like the legend of John Henry, the 19th-century laborer who beat a steam drill in a contest but died in the effort. Jennings prefers a comparison to "Terminator," where the hero was a little more resilient.

"I had a friend tell me, 'Remember John Henry, the steel-drivin' man.' And I was like ... 'Remember John Connor!'" Jennings said. "We're gonna take this guy out!"

___

Associated Press writer Leon Drouin-Keith in New York City contributed to this report.


Source

IBM's Watson on 'Jeopardy': Computer takes big lead over humans in Round 2

February 15, 2011 | 9:20 pm

On Tuesday night's "Jeopardy" episode, Watson, the IBM supercomputer, steamrollered to a commanding lead over its human competitors.

Watson missed just one question before the first commercial break, getting 14 of the 15 questions correct. The lone question Watson missed was answered correctly by former "Jeopardy" champion Ken Jennings.

Jennings and Brad Rutter, also a former champ on the game show, eked out just five correct responses during the second round of the challenge, which began Monday and ends Wednesday night.

By the end of the episode, Watson had amassed $35,734 in winnings, with Rutter in second place at $10,400 and Jennings (who holds the record of 74 straight "Jeopardy" wins) at just $4,800.

Watson and Rutter ended Monday night's episode in a tie with $5,000 in winnings each and Jennings at $2,000. With Jennings behind after the first round, he chose the first category Tuesday night to start off the show.

But Watson was too quick for Jennings and Rutter on the first question and for much of the night.

Just as on Monday, Watson was competitive yet showed that there was plenty of room for improvement. A skill that Watson lacks is making decent bets.

When the IBM computer landed a Daily Double question in the Cambridge category, it bet $6,435.

Host Alex Trebek shook his head and said, "I won't ask," drawing a laugh from the audience.

Watson got the right answer -- "Who is Sir Christopher Wren?" -- and boosted its earnings to $21,035.

One moment of man over machine came in the Final Jeopardy question, which both Jennings and Rutter got right and Watson got wrong.

To "Its largest airport is named for a World War II hero; its second largest, for a World War II battle" the humans correctly submitted the question "What is Chicago?"

Watson, which bet only $947 on the final clue, submitted "What is Toronto?????", the five question marks signaling that the computer was unsure of its answer.
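As a loose illustration of that on-screen convention, the annotation could be driven by the same confidence score; the cut-off and the number of question marks below are guesses, not IBM's published behavior.

```python
# Toy illustration of the display convention described above: append extra
# question marks when confidence in the response is low. The 0.5 cut-off and
# the five marks are guesses, not IBM's actual rule.

def annotate(response: str, confidence: float) -> str:
    marks = "?" if confidence >= 0.5 else "?????"
    return response + marks

print(annotate("What is Chicago", 0.85))   # -> What is Chicago?
print(annotate("What is Toronto", 0.14))   # -> What is Toronto?????
```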

On Wednesday night the three will face off in the final round of the match, dubbed the IBM Challenge, with the winner taking home $1 million. If Jennings or Rutter wins, half the money will go to charity; if Watson is victorious, all of the cash will be donated.


Source

Watson plays 'Jeopardy!' well, but what else can it do?

By Dan Vergano, USA TODAY

Watson won decisively after three days against Ken Jennings, left, and Brad Rutter on Jeopardy! (Photo: Carol Kaelson, Jeopardy Productions, via AP)

Watson, the IBM computer created to be a champ-challenging contestant on Jeopardy!, chewed up and spat out humanity's finest knights of trivia Wednesday night in a $77,147 drubbing of past record winners Ken Jennings ($24,000 total) and Brad Rutter ($21,600).

"I sort of felt like I wanted to win here as badly as I ever have before. This is like the dignity of the species," Jennings told CBS Evening News after the first night's show.

So has the fatal day of reckoning arrived, foretold by Frankenstein, 2001: A Space Odyssey and, of course, The Terminator series of man-vs.-machine movies? Even trivia, one of humanity's favorite pursuits, has yielded to the mighty computer.

What is "not so much"?, Alex.

"It is clear (Watson) was designed to play Jeopardy! very well," says computer intelligence expert John Laird of the University of Michigan in Ann Arbor, both in its vast erudition — answering Saturday Night Live's "Church Lady" in one question — and its "impressive" accuracy. But, he adds, "I think IBM could have a challenge to move this to other fields."

IBM is trying. Last week, the Armonk, N.Y.-based computer-services titan announced partnerships with eight universities, including MIT and the University of Texas, to explore new uses of Watson's "Deep Question" technology. The process lets Watson sift through and weigh its confidence in answers to complex questions. IBM's David Ferrucci says the company sees future uses for Watson in automated question answering for health care and legal aid.
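A very rough sketch of that sift-and-weigh idea follows: propose candidate answers, score each against supporting text, and rank by a crude confidence. Everything in it, including the corpus, the candidate generator, and the scoring rule, is a stand-in for illustration; it is not IBM's DeepQA pipeline.

```python
# Stand-in sketch of "generate candidates, gather evidence, weigh confidence".
# The corpus, candidate generator, and scorer are all invented placeholders.

CORPUS = [  # stand-in for encyclopedias, dictionaries, news, scripts...
    "Christopher Wren designed the library of Trinity College, Cambridge.",
    "O'Hare airport in Chicago is named for a World War II hero.",
    "Midway airport in Chicago is named for a World War II battle.",
]

def candidate_answers() -> list[str]:
    """Toy candidate generation: every capitalized word in the corpus."""
    words = " ".join(CORPUS).replace(",", " ").replace(".", " ").split()
    return sorted({w for w in words if w[0].isupper()})

def evidence_score(candidate: str, clue: str) -> int:
    """Toy scorer: count clue words that co-occur with the candidate."""
    clue_words = set(clue.lower().split())
    score = 0
    for passage in CORPUS:
        if candidate.lower() in passage.lower():
            score += len(clue_words & set(passage.lower().split()))
    return score

def answer(clue: str) -> tuple[str, float]:
    scored = {c: evidence_score(c, clue) for c in candidate_answers()}
    total = sum(scored.values()) or 1
    best = max(scored, key=scored.get)
    return best, scored[best] / total   # crude "confidence": share of evidence

print(answer("Its largest airport is named for a World War II hero"))
```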

IBM also announced Thursday a research partnership with Nuance Communications Inc. of Burlington, Mass., maker of the popular Dragon speech recognition software, to develop a physician's-assistant technology based on Watson. Within a year, they hope to test mobile devices equipped with a medical-knowledge version of Watson to help with diagnoses and treatment recommendations as physicians make rounds in hospitals connected to medical schools at Columbia University and the University of Maryland. "This is not about games," says IBM research executive John Kelly. "We want to apply this technology to benefit medicine in a way that really matters."

Computer vs. brain: Feeling a little computer envy? Don't let IBM's Watson, the champ-crushing computer on Jeopardy!, get you down.

But even amid Watson's triumph, some answers it blew point to problems, say experts such as machine-learning pioneer Douglas Lenat of Cycorp in Austin. "Like a human idiot savant, it would get wrong a large fraction of things that almost all sane adult humans would get right," Lenat says. "For instance, if the category were 'Uphill vs. Downhill,' and the clue was 'The direction that beer flows.' " (Watson doesn't understand gravity as a concept, and few documents it searches are likely to discuss the propensity of beer to flow downhill.)

Then there was the "Final Jeopardy" question Tuesday in the "U.S. cities" category, which asked for the city whose largest airport is named for a World War II hero and whose second largest is named for a World War II battle. Watson chose "Toronto???" instead of Chicago (O'Hare and Midway).

Ferrucci defends Watson by noting that it gave this answer with only 14% confidence because it had to respond; unlike in regular play, Final Jeopardy does not let a contestant stay silent. "Just because a Jeopardy! category says 'U.S. cities' does NOT mean the answer is (actually the name of) a U.S. city," Ferrucci says by e-mail.
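The contrast with regular play can be sketched simply: in Final Jeopardy the system must submit its best candidate however weak it is, so the remaining lever is the wager. The 14% confidence and the roughly $36,681 Watson held going into the clue (its reported $35,734 finish plus the $947 it lost) come from the coverage; the runner-up score and the risk factor below are invented, and this is not Watson's actual wagering strategy.

```python
# Illustration only: Final Jeopardy forces a response, so instead of passing,
# a low-confidence system can only shrink its wager. The risk factor and the
# runner-up score are invented; this is not Watson's wagering strategy.

def final_jeopardy(candidates: dict[str, float], bankroll: int) -> tuple[str, int]:
    best = max(candidates, key=candidates.get)
    confidence = candidates[best]
    wager = int(bankroll * confidence * 0.2)   # arbitrary, confidence-scaled risk
    return best, wager

candidates = {"What is Toronto?": 0.14, "What is Chicago?": 0.11}
response, wager = final_jeopardy(candidates, bankroll=36_681)
print(f"{response}  (wager ${wager})")
```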

Building off decades of artificial-intelligence research, Watson pursues one well-trod path, broadly analyzing vast libraries of text for answers, rather than relying on deep structural knowledge of a very narrow area, such as lunar geology, seen in so-called "expert" systems, says Ellen Voorhees of the National Institute of Standards and Technology in Gaithersburg, Md.

Who is best at the buzzer?

Some question whether Watson enjoys an unfair buzzer-beating advantage over its human opponents on Jeopardy!. On the show, a clue is displayed and host Alex Trebek reads it aloud. Only when he finishes can contestants buzz in to give their answer.

So does Watson have an advantage in receiving the text of the clue at the same instant the human players see it?

"Oddly enough, humans have an advantage because they can anticipate when the buzzers are enabled by listening to the clue and timing it," says IBM’s David Ferrucci. In contrast, Watson uses an? algorithm to guess when it is safe to buzz, he says, costing time. But people often mis-anticipate Trebek’s timing, "pre-buzzing," which locks them out for 0.25 seconds, long enough to lose.

"It is a significant advance," she says, but it doesn't have both the broad search and deep knowledge of an ideal system.

"Compared to a human brain, Watson doesn't even come close in computational power," says information scholar Martin Hilbert of the University of Southern California. "If only we put as much effort in educating human brains as we spent on computers." Watson required the work of more than 20 researchers over four years to develop.

Anxiety about Watson says more about people than computers, says Andrew Meltzoff of the University of Washington in Seattle, an expert on robot-human interaction. "To us, potentially losing to a computer raises issues of 'Who am I? What does it mean to be human?' " he says. "The Jeopardy! event is hype for humans. The computer doesn't care. The debate is a clue to what makes us human."

 
