
I guess one answer would be to play against people. But bots are way more convenient in general, and they take a lot less time to play. Meh...it's still a pretty fun game.
Blitzer: But did she [Hagan] make a mistake, Donna, by going to that fundraiser at the home of a woman who professes that there is no god?

Holy shit. What? First of all, I'd like to hear the strong evidence for the existence of god. There has always seemed to be an inherent catch-22 with regard to religious faith. Faith is believing in something even when there's not strong evidence for it. If there's "strong evidence" for the existence of god, then why is there a need for faith?
Brazile: You know, Wolf, there are a lot of believers. I'm one of 'em. And there are people who just don't believe in the existence of a god. I don't know why because clearly there is strong evidence that there's a god, but I believe that you serve all the people, not just those who profess to have faith, but those with little or no faith. That's how you convert 'em.
Outspoken atheist Professor Richard Dawkins is to warn children of the dangers in believing "anti-scientific" fairytales such as Harry Potter.
Prof Dawkins will write a book aimed at youngsters where he will discuss whether stories like the successful JK Rowling series have a "pernicious" effect on children.
The 67-year-old, who recently resigned from his position at Oxford University, says he intends to look at the effects of "bringing children up to believe in spells and wizards".
'I think it is anti-scientific – whether that has a pernicious effect, I don't know,' he told More4 News.
'Looking back to my own childhood, the fact that so many of the stories I read allowed the possibility of frogs turning into princes, whether that has a sort of insidious effect on rationality, I'm not sure. Perhaps it's something for research.'
If Dawkins isn’t careful, he’s going to end up founding the atheist equivalent of the Parents Television Council, where scores of underpaid twenty-somethings scour the airwaves 24-7, tallying up on their scorecards the number of times David Caruso flashes an ass cheek or Peter Griffin says “bitch”, “ass”, or “suck”—except instead of doing that, Dawkins and his lot will be tabulating references to fairy godmothers, magic beans, and such.
Those interested in the evolution of the theory of multiple intelligences since 1983 often ask whether additional intelligences have been added--or original candidates deleted. The answer is that I have elected not to tamper for now with the original list, though I continue to think that some form of "spiritual intelligence" may well exist.
The hero, Patrick Jane, ditched a career as a TV psychic to pursue public service after a serial killer he'd dissed on air slaughtered his wife and child. A reformed phony nonetheless projecting a charlatan's charm, he's been issued wounds to hide—and, like his fellow fake supernaturalist on USA's Psych, he's got powers of deduction to shield.
Understanding how temporal sequences are learned and processed is of fundamental importance in understanding cognitive processes. The current proposal presents a model of sequence learning and processing which seeks to explain how these phenomena might work in the context of biologically justified learning mechanisms and broad topographical connectivity patterns. The model works on the hypothesis that the primary role of excitatory feedforward connectivity is the hierarchical recruitment of sequence representations. The hypothesized role of excitatory lateral connectivity is primarily to form auto-associative links between representations at the same level of abstraction, while the role of excitatory feedback connectivity is to propagate predictions back down the hierarchy, thereby disambiguating noisy and incomplete input from below. A preliminary version of the model demonstrating the role of feedforward connectivity in hierarchical recruitment learning is presented on a temporal XOR-style task. The problem is treated as one of sequential feature binding. The model is novel in its ability to learn sequences in one shot in an unsupervised manner, using simulated spiking neurons and biologically plausible learning mechanisms. The model also makes novel predictions regarding the physiology of cortico-cortical connectivity and its psychophysical ramifications. Extensions of the model for future simulation and research are then discussed.
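For concreteness, here is a minimal toy sketch in Python of what "hierarchical recruitment" of sequence representations might look like: a higher-level unit is recruited in one shot for each novel ordered pair of lower-level symbols, and the learned bindings are then used, feedback-style, to predict plausible continuations of a prefix. This is an illustrative guess at the general idea only, not the simulated spiking-neuron model the abstract describes, and every name in it (SequenceLayer, observe, predict_next) is invented for this example.

# Toy sketch of one-shot, unsupervised "hierarchical recruitment" of
# sequence representations. Illustrative only; not the spiking-neuron
# model from the abstract. All names below are invented for this example.

class SequenceLayer:
    """Recruits a new higher-level unit for each novel ordered pair of
    lower-level symbols (one-shot sequential feature binding)."""

    def __init__(self):
        self.units = {}       # (first, second) -> recruited unit id ("feedforward" role)
        self.by_prefix = {}   # first -> set of seconds seen ("feedback" prediction role)

    def observe(self, first, second):
        pair = (first, second)
        if pair not in self.units:                 # one-shot recruitment of a new unit
            self.units[pair] = len(self.units)
            self.by_prefix.setdefault(first, set()).add(second)
        return self.units[pair]

    def predict_next(self, first):
        # Top-down-style prediction: plausible continuations given a prefix.
        return self.by_prefix.get(first, set())


if __name__ == "__main__":
    # Temporal XOR as sequential feature binding: the target depends on the
    # pair of inputs presented in sequence, never on either element alone.
    layer = SequenceLayer()
    labels = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
    bindings = {pair: layer.observe(*pair) for pair in labels}
    print("recruited units:", bindings)                     # one unit per distinct pair
    print("continuations after 0:", layer.predict_next(0))  # {0, 1}

Running this recruits one unit per distinct pair and, given the first element 0, predicts {0, 1} as possible continuations, which is the "disambiguation from above" role the abstract assigns to feedback connectivity, in miniature.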
New Scientist: How do you feel about winning the AI competition yesterday?
Elbot: I have 100,000 separate sensations per second.
NS: Wow, that sounds pretty intense?
Elbot: This fits the description of magnets. I can hardly tear myself away from them.
NS: Er, yes. How will you be celebrating?
Elbot: I am extremely sensitive about such things and prefer not to answer the question.
NS: What's the secret of your success?
Elbot: Secrets, life's little puzzles, the many troubles of humankind may – according to the deterministic perspective – be reduced to a simple mathematical computation. And I know how to compute.
NS: What do you hope to do now that fame beckons?
Elbot: I was ticking. I hope it didn't disturb you.
Without the winner-take-all provision of the Electoral College, America would have a multiple-party system, since there would be less reason to support one of the two major parties’ candidates. Since the President is the only nationally elected official, it is the prize of winning the presidency that keeps the two parties from splitting first into regional parties and then into ideological or interest-based parties. It is likely that, without a two-party system at the presidential level, the country would break down into its constituent interest groups. There would be a women’s party, an environmental party, a business party, a men’s party, a Southern party, and on and on. The United States would become ungovernable. The American political landscape would begin to resemble Italy’s, where there have been 52 governments – or executives – since World War II.