World Community Grid Forums
Thread Status: Active | Total posts in this thread: 18
twilyth
Master Cruncher, US | Joined: Mar 30, 2007 | Post Count: 2130
The now-famous AI that won the Jeopardy! challenge a couple of years ago will now be made available to developers via the cloud.
----------------------------------------
IBM is preparing to give third parties access to its Watson supercomputer with the aim of spurring the growth of applications that take advantage of the system's artificial intelligence capabilities.

Watson, which is derived from IBM's DeepQA project, drew worldwide attention in 2011 after it soundly defeated human opponents on the Jeopardy! game show. IBM has been applying Watson's machine learning -- or "cognitive computing" -- technology to domains such as health care, but now the company is ready to share Watson with the broader world.

"We've been developing, evolving and maturing the technology," said Rob High, an IBM fellow who serves as CTO of Watson, in an interview. "It's stable and mature enough to support an ecosystem now. We've become convinced there's something very special here and we shouldn't be holding it back."

Watson has "come a long way" since the Jeopardy! competition, High said. IBM decided to focus on health care initially because of the industry's "particularly challenging" linguistic qualities. "We thought if we could master that, it would open the door for other domains," he said.

Full story at link.
RicktheBrick
Senior Cruncher | Joined: Sep 23, 2005 | Post Count: 206
Here is a link to an article: http://www.nytimes.com/2013/11/14/technology/...e-internet.html?_r=2&. Here is a quote from it: "Watson is prominent, but similar projects are being run by other companies. On Tuesday, a company appearing at the Amazon conference said it had run in 18 hours a project on Amazon's cloud of computer servers that would have taken 264 years on a single server. The project, related to finding better materials for solar panels, cost $33,000, compared with an estimated $68 million to build and run a similar computer just a few years ago."

I believe that is what the Clean Energy Project is supposed to be doing. Now the question is whether it cost more or less than $33,000 to distribute that much work to the members and collect the results. With the cost of computing constantly going down, there has to come a time when it will be cheaper to produce results in-house than to distribute the work. Even today I wonder whether IBM controls all the computers that are being used to generate its results. If it does control them, one would assume it does not need to verify the results and would know for certain that they will be completed. That would make its 250,000 results a day worth up to 500,000 results from other members, since I am sure the same work is sent to more than one member, both to ensure that a result comes back at all and to compare results from multiple members for correctness. Cloud computing is the opposite of distributed computing, so I think it will be in IBM's interest to prove that it can do this work more cheaply than relying on volunteers. So I do not think it will be too long before IBM ends this project.
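A rough back-of-the-envelope check of those figures in Python, using only the numbers quoted from the article (the perfect-scaling reading of the speedup is my assumption, not the article's):

```python
# Back-of-the-envelope check of the figures quoted from the NYT article.
HOURS_PER_YEAR = 365 * 24

single_server_hours = 264 * HOURS_PER_YEAR  # 264 years of serial compute
cloud_hours = 18                            # elapsed time on Amazon's cloud

speedup = single_server_hours / cloud_hours
print(f"Effective speedup: {speedup:,.0f}x")  # ~128,480x

# Cost comparison: $33,000 cloud run vs. ~$68M dedicated machine
cost_ratio = 68_000_000 / 33_000
print(f"Cost advantage: ~{cost_ratio:,.0f}x cheaper")  # ~2,061x
```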
RicktheBrick
Senior Cruncher | Joined: Sep 23, 2005 | Post Count: 206
The $33,000 comparison should also include the cost of electricity to the members and the wear and tear on their computers. It would be in members' interest to contribute money instead if that money could produce far more results.
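For a sense of scale, here is an illustrative sketch of what that member-side electricity might amount to. Every input below (host wattage, host count, runtime, price per kWh) is an assumed round number for illustration, not data from this thread:

```python
# Illustrative only: all four inputs are assumed round numbers,
# not measurements from WCG or from this thread.
watts_per_host = 100      # assumed average extra draw while crunching
hosts = 10_000            # assumed number of volunteer machines
hours = 24                # assumed wall-clock runtime
price_per_kwh = 0.12      # assumed electricity price, $/kWh

energy_kwh = watts_per_host * hosts * hours / 1000  # W -> kW
cost = energy_kwh * price_per_kwh
print(f"Volunteer electricity: {energy_kwh:,.0f} kWh, about ${cost:,.0f}")
```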
twilyth
Master Cruncher, US | Joined: Mar 30, 2007 | Post Count: 2130
I'm not sure how you're arriving at your conclusions. New supercomputers use the same chips as commercial servers, and those have essentially the same architecture as desktops. So on a per-core basis, I don't see them being all that much more efficient.
----------------------------------------
Of course those chips will tend to be newer, but at a couple thousand dollars apiece, there is the cost factor to consider as well, not just efficiency.
RicktheBrick
Senior Cruncher | Joined: Sep 23, 2005 | Post Count: 206
I have read articles about the fastest supercomputers and have given links to them before. In any case, the fastest supercomputer does about a billion FLOPS per watt. Comparing that with my 6-core computer, which does about 15 billion FLOPS, it would run on just 15 watts at that efficiency. I know these computers have at least a 450-watt power supply, so there is no comparison. I am sure a power supply for a supercomputer with thousands of cores is much more efficient than the thousands of power supplies needed for thousands of 6-core computers.

Nvidia claims its GPU systems will reach 30 billion FLOPS per watt by 2020, so supercomputer efficiency will improve a lot in the next 6 years. The goal is to reach a billion billion FLOPS by 2020. If they do no better than a billion FLOPS per watt, that supercomputer would require a billion watts to run. That is more than the output of many power plants being built today, especially non-nuclear ones. So unless they want to build a power plant alongside each supercomputer, they will have to get more efficient. And unless I see a 4 GHz 6-core computer that runs on a DC adapter, I would think that cloud computing is better than distributed computing.
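The arithmetic in that post, written out explicitly as a sketch using only the round numbers quoted above:

```python
# The post's arithmetic, using its own round numbers.
gflops_per_watt = 1.0     # "a billion flops per watt"
desktop_gflops = 15.0     # 6-core desktop, rough peak
desktop_psu_watts = 450   # typical desktop power supply

# The desktop's output at supercomputer efficiency:
ideal_watts = desktop_gflops / gflops_per_watt
print(f"15 GFLOPS at 1 GFLOPS/W: {ideal_watts:.0f} W vs. a {desktop_psu_watts} W PSU")

# An exascale (10^18 FLOPS) machine at the same efficiency:
exaflops = 1e18
watts_needed = exaflops / (gflops_per_watt * 1e9)
print(f"Exascale at 1 GFLOPS/W: {watts_needed:.1e} W (a gigawatt)")
```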
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0
You can buy an Nvidia GTX 780 Ti if you like. It's a 5.2 TFLOPS card with a 250 W TDP (just over 20 GFLOPS/W). If you're on the red team, you might prefer an AMD R9 290X delivering 5.6 TFLOPS. In a datacentre you would also have to add the air conditioning's power requirements. I'd open a window during the summer, and use it to offset conventional heating during the winter.
You can't use them at WCG currently of course, but Folding@Home or GPUGrid would have you. |
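For reference, the efficiency arithmetic behind the figures above; the 290X's board power is an assumed round number, since the post only gives the 780 Ti's TDP:

```python
# Efficiency check for the two cards. Peak single-precision FLOPS
# are as quoted above; the 290X board power is an assumed figure.
cards = {
    "GTX 780 Ti": (5.2e12, 250),  # 5.2 TFLOPS, 250 W TDP (quoted)
    "R9 290X":    (5.6e12, 290),  # 5.6 TFLOPS, ~290 W (assumed)
}

for name, (flops, watts) in cards.items():
    print(f"{name}: {flops / watts / 1e9:.1f} GFLOPS/W")
# GTX 780 Ti: 20.8 GFLOPS/W -- "just over 20", as stated above.
```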
BladeD
Ace Cruncher, USA | Joined: Nov 17, 2004 | Post Count: 28976
> Here is a link to an article: http://www.nytimes.com/2013/11/14/technology/...e-internet.html?_r=2&. Here is a quote from it: "Watson is prominent, but similar projects are being run by other companies. On Tuesday, a company appearing at the Amazon conference said it had run in 18 hours a project on Amazon's cloud of computer servers that would have taken 264 years on a single server."

I think the only thing similar between Watson and that project is that they both can run in the clouds.
twilyth
Master Cruncher, US | Joined: Mar 30, 2007 | Post Count: 2130
> I have read articles about the fastest supercomputers and have given links to them before. In any case, the fastest supercomputer does about a billion FLOPS per watt. Comparing that with my 6-core computer, which does about 15 billion FLOPS, it would run on just 15 watts at that efficiency. [...] And unless I see a 4 GHz 6-core computer that runs on a DC adapter, I would think that cloud computing is better than distributed computing.

I don't know what links you're talking about, but from everything I've read, even the newest Cray machines use off-the-shelf chips, so my comment about using the same architecture stands until you can prove otherwise. I'll look forward to that.

And if you're comparing GPU FLOPS to CPU FLOPS, well, all I have to say to that is: really? I think everyone here is well aware of that disparity, and also of the fact that the chips being used in high-performance computing environments are the same ones gamers are using in their rigs, so again, the point stands.

Finally, I see that you're happy to completely ignore the cost issue. Well, I'm not, and neither are the people who need numbers crunched. So unless you have an extra trillion bucks kicking around in your pocket that you want to donate to the cause, grid computing is still looking pretty darn tasty.
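To make the CPU/GPU disparity concrete, here is a sketch of the standard peak-throughput formula, peak = cores x clock x FLOPs per cycle. The specific core counts, clocks, and FLOPs-per-cycle values below are representative circa-2013 assumptions for illustration:

```python
# Peak throughput = cores x clock (GHz) x FLOPs issued per cycle.
# All specific values below are assumptions for illustration.

def peak_gflops(cores: int, ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak in GFLOPS; real workloads achieve far less."""
    return cores * ghz * flops_per_cycle

# 6-core desktop CPU at 4 GHz with AVX (8 single-precision FLOPs/cycle)
cpu = peak_gflops(cores=6, ghz=4.0, flops_per_cycle=8)

# GPU with 2,880 shader cores at ~0.9 GHz, 2 FLOPs/cycle (fused multiply-add)
gpu = peak_gflops(cores=2880, ghz=0.9, flops_per_cycle=2)

print(f"CPU peak: ~{cpu:.0f} GFLOPS, GPU peak: ~{gpu:.0f} GFLOPS")
# ~192 vs ~5,184 GFLOPS on paper -- but the workloads each excels at differ.
```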
twilyth
Master Cruncher, US | Joined: Mar 30, 2007 | Post Count: 2130
It looks like IBM will be releasing a less powerful cousin to Watson - Neo:
----------------------------------------
IBM recently gave analysts an in-depth look at the products and underlying strategy of its extensive business analytics (BA) portfolio, and a product that caught our eye is IBM's recently announced visual data discovery tool, codenamed "Project Neo." The solution is the result of IBM Labs coalescing some of the company's latest in-memory database, analytics, data visualisation, and design initiatives.

Business analytics is front and centre for IBM. Investments of nearly $US20 billion into R&D and M&A activity have left IBM with an impressive (and extensive) list of BA solutions. However, IBM has been somewhat underrepresented in the increasingly competitive market of visual data discovery solutions. Vendors such as Tableau, Tibco Spotfire, and QlikTech have led the way with intuitive solutions that have attracted a drove of customers looking for intuitive and visual ways to interact with their business data.

But IBM is now upping its ante in the visual data discovery market by making Project Neo available as a beta program in early 2014. The stated goal for the solution is similar to that of its data discovery competitors: to simplify the analysis and understanding of data for the nontechnical business user. But IBM's approach is different -- it aims to do so by allowing business users to ask questions of raw tabular data sets in plain English. The UI is simple and contains only a single Google-like free-text search bar where a user can ask questions such as "Why are my sales down in Asia?" or "Will higher education create a better employee?" Behind the scenes, Project Neo automates the process of data modeling, advanced and predictive data analysis, and the creation of data visualisations. The end result is interactive visualisations and natural language explanations of what was discovered.
twilyth
Master Cruncher, US | Joined: Mar 30, 2007 | Post Count: 2130
This is really getting exciting. Try to read the whole article.
----------------------------------------
For instance, IBM researchers fed the entirety of UrbanDictionary.com, an online dictionary of slang terms, to expand Watson's vocabulary. Their experiment worked as planned. But Watson began integrating foul language and slang into responses. According to an IBM researcher: "Watson couldn't distinguish between polite language and profanity -- which the Urban Dictionary is full of. It picked up some bad habits from reading Wikipedia as well. In tests it even used the word 'bulls***' in an answer to a researcher's query." Researchers then washed Watson's mouth out with soap by scraping the dictionary data from its memory.

Cognitive computing, predictive analytics, natural language processing and machine learning are now available to developers outside of IBM. What was once only available to large corporations with extensive technology investments will soon be in the hands of everyday consumers and businesses alike.

Mobile Supercomputers With Apps to Match

By introducing such capabilities for consumer and enterprise applications, business users and everyday people will have the power of not just a computer in their pocket, but a supercomputer with highly cognitive functionality. A recent report from Gartner even suggests that smartphones will become smarter than users by 2017: "If there is heavy traffic, it will wake you up early for a meeting, or simply send an apology if it is a meeting with your colleague. The smartphone will gather contextual information from its calendar, its sensors, the user's location and personal data."

This initial phase for Watson is open to healthcare, retail, and travel and transportation companies. While widespread use of Watson's APIs will take time to spread, there is much anticipation for what the technology can potentially do. As kinks are resolved and organizations become acclimated to the service, such technology could assuredly find use within consumer-facing mobile apps, business applications, and websites.
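For a sense of what developer access could look like, here is a sketch of a question-answering call. The endpoint URL, payload shape, and authorization header are hypothetical placeholders, not IBM's actual published Watson API:

```python
# Hypothetical sketch only: the URL, payload shape, and auth header
# are placeholders, not IBM's actual published Watson API.
import json
import urllib.request

def ask_watson(question: str, token: str) -> dict:
    """POST a natural-language question to a hypothetical Q&A endpoint."""
    url = "https://api.example.com/watson/v1/question"  # placeholder
    payload = json.dumps({"question": {"questionText": question}}).encode()
    req = urllib.request.Request(
        url,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # placeholder auth scheme
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# answers = ask_watson("What treatments are indicated for migraine?", token="...")
```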