
IBM Ups Its Quantum Computing Game

Discussion in 'The Americas' started by BMD, Nov 14, 2017.

  1. BMD

    BMD Colonel ELITE MEMBER

    Joined:
    Nov 20, 2012
    Messages:
    10,706
    Likes Received:
    2,996
    Country Flag:
    United Kingdom
    https://www.top500.org/news/ibm-ups-its-quantum-computing-game/

    IBM Ups Its Quantum Computing Game

    Michael Feldman | November 12, 2017 23:20 CET
    IBM has revealed it has a working prototype of a 50-qubit quantum computer, along with a 20-qubit system that will be made available to clients before the end of the year.

    IBM has made 5-qubit and 16-qubit systems freely available to users via its IBM Q cloud, an environment that has given researchers access to the company’s quantum computing hardware. The environment also includes access to QISKit, an open-source developer kit that provides a high-level interface to the IBM hardware via Python. The company claims over 60,000 users have run more than 1.7 million quantum experiments using IBM Q since it came online last year.
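
    For a taste of what a QISKit experiment looks like, here is a minimal sketch using the present-day Qiskit Python API (the interface has evolved considerably since 2017, so treat the exact calls as illustrative rather than as the 2017 API). It prepares a two-qubit Bell state and inspects the statevector locally instead of submitting the job to IBM Q hardware.

    [CODE]
    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    # Build a two-qubit circuit that prepares a Bell state:
    # a Hadamard puts qubit 0 into superposition, a CNOT entangles qubit 1.
    qc = QuantumCircuit(2)
    qc.h(0)
    qc.cx(0, 1)

    # Simulate locally rather than submitting to IBM Q hardware.
    state = Statevector.from_instruction(qc)
    print(state)  # amplitude ~0.707 on |00> and |11>, zero elsewhere
    [/CODE]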

    The 20-qubit system will be available via this same environment sometime between now and the end of 2017. But IBM says this latest hardware will be the first IBM Q system for clients, implying that the 20-qubit technology will be the basis for the company’s initial commercial offerings.

    The step up to 20 qubits is a reflection of the rapid progress IBM has made in improving the hardware. The 5-qubit system launched in May 2016, followed a year later by the 16-qubit machine. Now, six months later, IBM has come up with a 20-qubit platform, which not only offers more qubits but also twice the coherence time – an average of 90 µs – of its less powerful predecessors. That means users will have more time to perform quantum calculations before the system becomes unusable. Although this may seem like a small window of opportunity for computation, it represents significant progress toward a fully fault-tolerant universal quantum computer.
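
    To put that 90 µs figure in perspective, a rough back-of-the-envelope estimate of the computation window is below; the ~100 ns gate duration is an assumed, illustrative number, not one from IBM's announcement.

    [CODE]
    # Rough estimate of how many gate operations fit in the coherence window.
    # The 100 ns gate time is an illustrative assumption, not an IBM figure.
    coherence_time = 90e-6   # average coherence of the 20-qubit chip, in seconds
    gate_time = 100e-9       # assumed duration of a single quantum gate, in seconds
    print(f"~{round(coherence_time / gate_time)} gates before decoherence")  # ~900
    [/CODE]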

    The press release didn’t go into much technical detail about how IBM engineers were able to simultaneously expand the number of qubits and increase the coherence times, only stating that the improvements were the result of better “superconducting qubit design, connectivity and packaging.” Details about the 50-qubit prototype, known as 50Q, were even harder to come by, with the press release only saying the processor is a “natural extension” of the 20-qubit technology and exhibits “similar performance metrics.”

    At one point, the 50-qubit milestone was considered the threshold of “quantum supremacy” – the point at which a quantum device would be able to outperform a classical computer simulating a quantum computing task. Last month, though, researchers from IBM and elsewhere demonstrated that a 56-qubit simulation could be run on a classical system by employing some clever algorithmic magic, pushing the quantum supremacy threshold a bit further into the future.

    Nevertheless, a 50-qubit chip would have a lot more capability than a 20-qubit chip, and would enable developers to significantly expand the scope of applications that could be run on these machines. IBM did not commit to a timeline for when a 50-qubit system would be available to users, but given IBM’s plans to provide a series of upgrades to the IBM Q environment in 2018 and its rapid pace of hardware improvements, a production machine may not be too far off.

    At this point, IBM’s nearest quantum competition is Google, which intends to demonstrate a working 49-qubit system before the end of 2017. However, Google has kept its quantum computing machinery behind closed doors, so it’s going to be difficult to compare the technologies until the web giant allows third parties to access its systems.

    @RMFAN
     
    Darth Marr likes this.
  2. BMD

    BMD Colonel ELITE MEMBER

    Joined:
    Nov 20, 2012
    Messages:
    10,706
    Likes Received:
    2,996
    Country Flag:
    United Kingdom
    https://gizmodo.com/the-quantum-d-wave-2-is-3-600-times-faster-than-a-super-1532199369

    The Quantum D-Wave 2 Is 3,600 Times Faster than a Super Computer
    Andrew Tarantola

    3/04/14 11:40am

    Quantum computing is being hailed as the future of data processing, with promises of performing calculations thousands of times faster than modern supercomputers while consuming orders of magnitude less electricity. And in the span of just two years, the only commercially available quantum computer, the D-Wave One, has already doubled its computational power. Kiss your law goodbye, Mr. Moore.


    Quantum computing differs from classical computing at its most fundamental level. While traditional computers rely on the alternate bit states of 1 and 0 to store data, quantum computers exploit the fuzzy effects of quantum mechanics, allowing their "qubits" to exist as a 1, a 0, or both simultaneously, a.k.a. "superposition." So while a traditional computer will sequentially explore the potential solutions to a mathematical optimization problem, the quantum system examines every potential solution simultaneously – a process known as quantum annealing – and returns answers, not just the single "best" but nearly 10,000 close alternatives as well, in roughly a second. What's more, unlike traditional computers, which rely on logic gates to manipulate bits, the D-Wave system uses an adiabatic process, reading out the ground state of its qubits to find a solution.
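
    A toy state-vector simulation makes the two key ideas concrete: n qubits are described by 2^n complex amplitudes, and observing the system collapses it to a single outcome. This is a generic sketch of superposition and measurement in plain numpy, not a model of D-Wave's annealing hardware.

    [CODE]
    import numpy as np

    # n qubits in superposition are described by 2**n complex amplitudes.
    n = 3
    state = np.ones(2**n, dtype=complex) / np.sqrt(2**n)  # equal superposition

    # Measurement collapses the superposition to one basis state, chosen
    # with probability equal to the squared magnitude of its amplitude.
    probabilities = np.abs(state) ** 2
    outcome = np.random.choice(2**n, p=probabilities)
    print(f"{2**n} amplitudes tracked, one outcome observed: {outcome:0{n}b}")
    [/CODE]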

    [Image: The 128-qubit array of the DW1]

    When the original D-Wave One (DW1) debuted in May 2011, it utilized a 128-qubit chipset – orders of magnitude faster than existing supercomputer technology – and was immediately purchased by research labs and defense contractors such as Lockheed, which installed one at USC in LA. However, the recently released D-Wave Two (DW2) blows its predecessor out of the water with a massive 512-qubit array.

    Each qubit is a small, superconducting processor that exploits quantum mechanical effects. These effects are magnified as more qubits are connected to one another. Were every one of the 509 functional qubits in the D-Wave Two hooked up to one another, the system would wield processing power 100 orders of magnitude more powerful than its predecessor. However, since each DW2 qubit communicates directly with only seven other qubits in tightly packed blocks or nodes – these nodes are then connected to other 8-qubit nodes in the lattice – the DW2 is only about 300,000 times more powerful than the D-Wave One. So, you know, just a modest spec bump.



    Heck, even when the DW2 was only using 439 qubits, as it did when D-Wave Systems consultant Catherine McGeoch tested it against state-of-the-art workstations running top-of-the-line CPLEX optimization algorithms, the DW2 produced results 3,600 times faster – providing 100 solutions in half a second, where the workstations required half an hour to produce the same.
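
    The 3,600x figure is just the ratio of the two reported wall-clock times:

    [CODE]
    # The reported speedup is the ratio of the two wall-clock times.
    workstation_time = 30 * 60   # half an hour, in seconds
    dwave_time = 0.5             # half a second
    print(workstation_time / dwave_time)  # -> 3600.0
    [/CODE]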


    But in order to take full advantage of quantum effects, the DW2 requires very specific, extreme conditions. For one, it operates at 0.02 kelvin – 150 times colder than the depths of interstellar space and just two hundredths of a degree above absolute zero – at a pressure 10 billion times lower than standard atmospheric pressure, and it experiences 50,000 times less magnetic interference thanks to its heavy shielding. Surprisingly, maintaining these conditions consumes just 15.5 kW and takes up just ten square meters of floor space, compared with the thousands of kilowatts and warehouses of space that traditional supercomputers require.

    [Image: A qubit – Ndickson at en.wikipedia]

    Google, NASA, and the Universities Space Research Association all kicked in on a DW2 in May of last year – D-Wave Systems won't quote prices, though the BBC estimates its cost at about $15 million – for machine learning research (which would help explain why Google's been snapping up every AI designer and robotics company it could get its hands on over the last six months). Quantum computing still has a way to go before it's reliable, but between recent breakthroughs and kickass hardware, it shouldn't be long before it's ready to catapult us into the future. [Wiki 1, 2 – Extremetech – BBC – D-Wave Systems – Nature]
     
    Darth Marr likes this.
  3. BMD

    BMD Colonel ELITE MEMBER

    Joined:
    Nov 20, 2012
    Messages:
    10,706
    Likes Received:
    2,996
    Country Flag:
    United Kingdom
    https://phys.org/news/2017-07-record-breaking-qubit-quantum-simulation-nersc.html

    Record-breaking 45-qubit quantum computing simulation run at NERSC
    July 3, 2017
    [Image: A multi-qubit chip developed in the Quantum Nanoelectronics Laboratory at Lawrence Berkeley National Laboratory.]

    When two researchers from the Swiss Federal Institute of Technology (ETH Zurich) announced in April that they had successfully simulated a 45-qubit quantum circuit, the science community took notice: it was the largest ever simulation of a quantum computer, and another step closer to simulating "quantum supremacy"—the point at which quantum computers become more powerful than ordinary computers.

    The computations were performed at the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility at the U.S. Department of Energy's Lawrence Berkeley National Laboratory. Researchers Thomas Häner and Damien Steiger, both Ph.D. students at ETH, used 8,192 of 9,688 Intel Xeon Phi processors on NERSC's newest supercomputer, Cori, to support this simulation, the largest in a series they ran at NERSC for the project.

    "Quantum computing" has been the subject of dedicated research for decades, and with good reason: quantumcomputers have the potential to break common cryptography techniques and simulate quantum systems in a fraction of the time it would take on current "classical" computers. They do this by leveraging the quantum states of particles to store information in qubits (quantum bits), a unit of quantum information akin to a regular bit in classical computing. Better yet, qubits have a secret power: they can perform more than one calculation at a time. One qubit can perform two calculations in a quantum superposition, two can perform four, three eight, and so forth, with a corresponding exponential increase in quantum parallelism. Yet harnessing this quantum parallelism is difficult, as observing the quantum state causes the system to collapse to just one answer.

    So how close are we to realizing a true working prototype? It is generally thought that a quantum computer deploying 49 qubits will be able to match the computing power of today's most powerful supercomputers. Toward this end, Häner and Steiger's simulations will aid in benchmarking and calibrating near-term quantum computers by carrying out quantum supremacy experiments with these early devices and comparing them to their simulation results. In the meantime, we are seeing a surge in investments in quantum computing technology from the likes of Google, IBM and other leading tech companies – even Volkswagen – which could dramatically accelerate the development process.


    Simulation and Emulation of Quantum Computers

    Both emulation and simulation are important for calibrating, validating and benchmarking emerging quantum computing hardware and architectures. In a paper presented at SC16, Häner and Steiger wrote: "While large-scale quantum computers are not yet available, their performance can be inferred using quantum compilation frameworks and estimates of potential hardware specifications. However, without testing and debugging quantum programs on small scale problems, their correctness cannot be taken for granted. Simulators and emulators … are essential to address this need."

    That paper discussed emulating quantum circuits—a common representation of quantum programs—while the 45-qubit paper focuses on simulating quantum circuits. Emulation is only possible for certain types of quantum subroutines, while the simulation of quantum circuits is a general method that also allows the inclusion of the effects of noise. Such simulations can be very challenging even on today's fastest supercomputers, Häner and Steiger explained. For the 45-qubit simulation, for example, they used most of the available memory on each of the 8,192 nodes. "This increases the probability of node failure significantly, and we could not expect to run on the full system for more than an hour without failure," they said. "We thus had to reduce time-to-solution at all scales (node-level as well as cluster-level) to achieve this simulation."
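
    The memory pressure is easy to reproduce. A full state-vector simulation stores one complex amplitude per basis state, or 2^n amplitudes for n qubits; assuming the standard 16-byte complex double per amplitude (the paper's exact layout may differ), the totals line up with the roughly 0.5 petabytes the article reports for 45 qubits.

    [CODE]
    # Memory to hold a full 2**n state vector of complex doubles (16 bytes each).
    for n in (30, 36, 42, 45):
        tib = (2 ** n) * 16 / 2 ** 40
        print(f"{n} qubits: {tib:g} TiB")
    # 45 qubits -> 512 TiB (0.5 PiB), in line with the ~0.5 PB used on Cori.
    [/CODE]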

    Optimizing the quantum circuit simulator was key. Häner and Steiger employed automatic code generation, optimized the compute kernels and applied a scheduling algorithm to the quantum supremacy circuits, thus reducing the required node-to-node communication. During the optimization process they worked with NERSC staff and used Berkeley Lab's Roofline Model to identify potential areas where performance could be boosted.
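
    The Roofline model referenced above bounds attainable performance by min(peak compute, memory bandwidth × arithmetic intensity). A minimal sketch follows; the peak and bandwidth numbers are hypothetical placeholders, not Cori's actual specifications.

    [CODE]
    # Roofline: attainable flop/s = min(peak_flops, bandwidth * arithmetic_intensity)
    # The two machine constants below are hypothetical, not Cori's real specs.
    PEAK_FLOPS = 2.6e12   # peak double-precision flop/s of a hypothetical node
    BANDWIDTH = 4.0e11    # memory bandwidth of that node, in bytes/s

    def attainable(intensity):
        """Attainable flop/s at a given arithmetic intensity (flops per byte)."""
        return min(PEAK_FLOPS, BANDWIDTH * intensity)

    for i in (0.5, 2.0, 8.0, 32.0):
        print(f"I = {i:>4} flop/byte -> {attainable(i) / 1e12:.2f} Tflop/s")
    [/CODE]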

    In addition to the 45-qubit simulation, which used 0.5 petabytes of memory on Cori and achieved a performance of 0.428 petaflops, they also simulated 30-, 36- and 42-qubit quantum circuits. When they compared the results with simulations of 30- and 36-qubit circuits run on NERSC's Edison system, they found that the Edison simulations also ran faster.

    "Our optimizations improved the performance – the number of floating-point operations per time – by 10x for Edison and between 10x and 20x for Cori (depending on the circuit to simulate and the size per node)," Häner and Steiger said. "The time-to-solution decreased by over 12x when compared to the times of a similar simulation reported in a recent paper on quantum supremacy by Boixo and collaborators, which made the 45-qubit simulation possible."

    Looking ahead, the duo is interested in performing more quantum circuit simulations at NERSC to determine the performance of near-term quantum computers solving quantum chemistry problems. They are also hoping to use solid-state drives to store larger wave functions and thus try to simulate even more qubits.



    Read more at: https://phys.org/news/2017-07-record-breaking-qubit-quantum-simulation-nersc.html#jCp
     
    Darth Marr likes this.
  4. BMD

    BMD Colonel ELITE MEMBER

    Joined:
    Nov 20, 2012
    Messages:
    10,706
    Likes Received:
    2,996
    Country Flag:
    United Kingdom
    https://www.dailyo.in/technology/ibm-quantum-computers-google-microsoft/story/1/20536.html

    IBM announces the world's fastest quantum computer - what does it mean?
    They have the potential to change the world.
    SCI-TECH
    | 5-minute read | 11-11-2017
    SUSHANT TALWAR

    @sushanttalwar

    According to The Wall Street Journal, quantum computers are so efficient at their job that "the computing power of a data center stretching several city blocks could theoretically be achieved by a quantum chip the size of the period at the end of this sentence".


    So should we be excited?

    So quantum computers are great news. Even Microsoft CEO Satya Nadella recently said so at India Today Conclave Next 2017, when he spoke about how quantum computers could change the world as we know it. But are they?

    At the conclave, Nadella spoke of the challenges that remain for technology, many of which he explained will be tackled by quantum computing.

    "We still cannot model enzymes in food production that will be crucial to solving the world's food needs... For the last 10 years, Microsoft has been working towards quantum computing that will help find solutions to such problems currently not possible to solve using brute force methods of modern day technology... Quantum computing is the future of technology that will change the way the world works."

    Well, the answer to this, like quantum computing, is zero and one, true and false, all at the same time.

    As Nadella, and several others have pointed out, quantum computers could theoretically be the answer to some of the major problems that could fundamentally change the world around us. But that is it. For now, much of what is being said about the benefits of quantum computing remains in the realm of theory. There is potential, but not tangible proof of its powers.


    Also, it is important to understand that these processors are extremely difficult to make and even more difficult to operate, which ensures that they will not become mass-use products anytime soon.

    Quantum computers need specific algorithms tailored to specific problems. They excel at certain tasks, like factoring numbers and modelling molecules, which require an insane amount of computational power, but they are useless for daily computational needs. Barring some major technological breakthrough, they will not make their way into our daily lives anytime soon.

    Plus, quantum computers, as opposed to classical ones, are extremely error-prone.

    For now, we are still awaiting the big breakthrough in science or medicine facilitated by the computational power of a quantum computer. We could stumble upon one in the next few years, or have to wait 10 or even 20 years for something major. Hopefully, this 50-qubit quantum computer by IBM will hasten the progress. For now, we can only hope.
     
    Darth Marr likes this.
  5. Darth Marr

    Darth Marr Captain FULL MEMBER

    Joined:
    Oct 19, 2016
    Messages:
    1,363
    Likes Received:
    2,301
    Country Flag:
    India
    Isn't NASA doing something with quantum computing and artificial intelligence? Can you imagine a self-learning, self-aware AI with quantum computing power? O_O
     
    BMD likes this.
  6. BMD

    BMD Colonel ELITE MEMBER

    Joined:
    Nov 20, 2012
    Messages:
    10,706
    Likes Received:
    2,996
    Country Flag:
    United Kingdom
    https://www.top500.org/news/oak-ridge-readies-summit-supercomputer-for-2018-debut/

    Oak Ridge Readies Summit Supercomputer for 2018 Debut
    Michael Feldman | November 15, 2017 02:11 CET
    Summit, the most powerful supercomputer in the United States, is currently under construction at the US Department of Energy’s Oak Ridge National Lab (ORNL). ORNL director Thomas Zacharia updates us on its status and reveals the opportunities the new machine will provide scientists when it comes online next summer.






    Zacharia, who was named lab director in June, takes the reins at ORNL just as Summit is poised to become the number one system in the United States and, with any luck, the fastest supercomputer in the world. The current champ, by Linpack standards, is the Sunway TaihuLight, a 93-petaflop supercomputer installed at the National Supercomputing Center in Wuxi, China. Summit is slated to hit at least 200 peak petaflops when completed next summer, and if China or anyone else doesn’t more than double the Sunway machine’s flops in the interim, the US will recapture the number one spot on the next TOP500 list. If that happens, it will be the first time the US has occupied the top position since 2012.

    Currently, the most powerful supercomputer at Oak Ridge is Titan, a five-year-old machine which, at 27 peak petaflops, is the fastest in the US and ranked fifth in the world. It’s powered by AMD Opteron CPUs and NVIDIA K20x GPUs. When it was purchased, GPU accelerators were pretty much a new thing for top-flight HPC machinery, and Oak Ridge established itself as something of a pioneer in developing supercomputing applications for this CPU-GPU architecture. With Summit, the lab will continue the heterogeneous tradition, but this time with IBM’s upcoming Power9 processors and NVIDIA’s latest GPU accelerator, the V100.

    With more than 25,000 V100 GPUs, Summit will not only top 200 peak petaflops in double precision math (64-bit floating point) for traditional HPC simulations, but will also deliver more than 3 exaflops for the 32-bit/16-bit mixed arithmetic employed by machine learning codes. Oak Ridge is well aware of the opportunity this presents, and according to Zacharia, plans to leverage the V100 extensively for this new application category.
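
    Those two headline numbers are roughly consistent with NVIDIA's published per-V100 peaks (about 7.8 teraflops double precision and 125 teraflops of tensor-core mixed precision for the SXM2 part) multiplied across the GPU count; the sketch below is approximate arithmetic, not an official breakdown.

    [CODE]
    # Approximate arithmetic behind the Summit figures, using published V100 peaks.
    n_gpus = 25_000
    fp64_per_gpu = 7.8e12     # ~7.8 Tflop/s double precision per V100 (SXM2)
    tensor_per_gpu = 125e12   # ~125 Tflop/s 16/32-bit tensor-core math per V100
    print(f"double precision: ~{n_gpus * fp64_per_gpu / 1e15:.0f} Pflop/s")   # ~195
    print(f"mixed precision:  ~{n_gpus * tensor_per_gpu / 1e18:.1f} Eflop/s") # ~3.1
    [/CODE]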

    “Many of our users are already developing sophisticated machine learning algorithms that, when combined with traditional modeling and simulation, will give us entirely new capabilities,” Zacharia told TOP500 News. “One team from Oak Ridge is working on a machine learning algorithm for Summit to help select the best treatment for cancer in a given patient. We have a team using Titan today to develop the deep learning tools to design and monitor fusion reactors. Another team is using machine learning to help classify types of neutrino tracks seen in experiments.”

    The melding of HPC simulations with machine learning is emerging as one of the more important developments in supercomputing over the next decade, and with Summit, Oak Ridge is positioned to be a leading center for some of the most ambitious work in this area. Of course, the system will also be tackling more traditional supercomputing applications, and Zacharia points to codes like ACME (climate modeling), DIRAC (relativistic quantum chemistry), FLASH (astrophysics), NWCHEM (computational chemistry), GTC (plasma physics), and NAMD (biophysics), which are expected to get a huge performance boost from Summit’s power.

    These applications, and others, will be ported to the Power9/V100 platform as part of an effort known as the Center for Accelerated Application Readiness (CAAR) program. It will involve “redesigning, porting, and optimizing application codes for Summit’s hybrid CPU–GPU architecture.” The development teams for these applications will get early access to Summit and receive support from the new IBM/NVIDIA Center of Excellence at ORNL.

    The installation of Summit appears to be proceeding at a steady pace. All of the cabinets are installed, and most of the interconnects are now wired. NVIDIA has been shipping the V100 GPUs for a while now, so there shouldn’t be any holdup on the accelerator side. And even though IBM hasn’t officially launched the Power9 processors, the company expects they will be in production before the end of the year.

    But this is the first time this Power9/V100 platform will be assembled, so it’s going to be a somewhat protracted installation. According to Buddy Bland, project director at the Oak Ridge Leadership Computing Facility, they expect the compute nodes to start arriving in late fall, but he doesn’t think all of the hardware will be onsite until February or March of next year, with formal acceptance scheduled for late summer 2018. Though the timeline seems unusually long, Bland maintains that this is pretty much in line with other large system installations.

    “It takes a long time to install and test 4600 nodes,” explained Bland. “Once the hardware can get through the diagnostic tests, then we have to start testing the system software and scaling that out to the size of the system. Again, there isn’t any other place to test this, so there will be several months of testing and debugging the software before the system will be ready to run the acceptance test.”

    When all of the dust settles, ORNL and its users will have at their disposal a truly unique machine, which Zacharia believes will advance scientific discovery for years to come. “Oak Ridge National Laboratory today is in a tremendously enviable position,” he noted. “We have signature strengths in materials, in neutrons, in computing, nuclear science and engineering. And we can apply that and the tremendous talent that we have to solve challenging problems in energy, in national security, in manufacturing, and in grid cyber security. So our opportunity is to put it all together and challenge ourselves to take the next step and be that premier research institution in the world. That, in and of itself, is exciting.”
     
    Darth Marr likes this.
  7. BMD

    BMD Colonel ELITE MEMBER

    Joined:
    Nov 20, 2012
    Messages:
    10,706
    Likes Received:
    2,996
    Country Flag:
    United Kingdom
    https://www.top500.org/news/intel-dumps-knights-hill-future-of-xeon-phi-product-line-uncertain/

    Intel Dumps Knights Hill, Future of Xeon Phi Product Line Uncertain
    Michael Feldman | November 15, 2017 04:34 CET
    E-mail
    Tweet
    Like
    +1
    Share
    2
    While vendors are busy announcing new HPC offerings at this week’s Supercomputing Conference (SC17), Intel announced it is removing its next-generation “Knights Hill” Xeon Phi product from its roadmap. And that might just be the beginning.

    In a blog penned by Intel’s Data Center Group GM Trish Damkroger describing the company’s exascale strategy and other topics the company is talking about at the SC17 conference, she offhandedly mentioned that the Knights Hill product is dead. More specifically, she said that the chip will be dropped in favor of “a new platform and new microarchitecture specifically designed for exascale.”

    Knights Hill, you might remember, was going to be the chip that powered Aurora, a DOE pre-exascale machine that was to be deployed at Argonne National Laboratory in 2018. But last month it was revealed that the Aurora contract had been rewritten, increasing the original performance target of 180 petaflops to over one exaflop.
     
