
Submission + - Open19 Launches Open Hardware Project Targeting Edge Computing (datacenterfrontier.com)

miller60 writes: The Open19 Foundation launched today, positioning its open hardware designs as a platform for edge computing and an alternative to the Open Compute Project and hyperscale designs. The Open19 designs were created by the data center team at LinkedIn, which cites their focus on a 19-inch rack and licensing terms it says give participants better control over their intellectual property. Open Compute develops the 21-inch Open Rack but also supports several designs for 19-inch racks, including the Project Olympus concept contributed by Microsoft, LinkedIn’s parent company.

Submission + - As Crypto Mining Grows, Data Centers Begin Accepting Bitcoin (datacenterknowledge.com)

miller60 writes: Citing strong demand from cryptocurrency miners, data center and colocation providers are beginning to accept Bitcoin as payment for large chunks of data center space. It's a sign that the data center industry sees an emerging opportunity in catering to the hosting needs of crypto miners, who typically seek high-density space with cheap power. While many web hosting companies accept Bitcoin, larger data center players have been slower to embrace cryptocurrency. Utah-based C7 Data Centers says it is accepting Bitcoin because of surging demand; the company now hosts about 4.5 megawatts of mining gear, just down the road from the NSA data center.

Submission + - Data Center With A Brain: Google Using Machine Learning in Server Farms (datacenterknowledge.com)

1sockchuck writes: Google has begun using machine learning and artificial intelligence to analyze the oceans of data it collects about its server farms and recommend ways to improve them. Google data center executive Joe Kava said the use of neural networks will allow Google to reach new frontiers in efficiency in its server farms, moving beyond what its engineers can see and analyze. Google's data centers aren't yet ready to drive themselves. But the new tools have been able to predict Google’s data center performance with 99.96 percent accuracy.
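
For a rough sense of what Kava describes, here is a minimal sketch of training a neural network on operational telemetry to predict PUE (power usage effectiveness). The feature names, synthetic data, and scikit-learn model are illustrative assumptions, not Google's actual pipeline.

# Minimal sketch: train a small neural network on data center telemetry to
# predict PUE. Everything here (features, data, model) is illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Synthetic telemetry: IT load (kW), outside air temp (C), chiller setpoint (C),
# number of cooling towers running.
it_load = rng.uniform(2000, 10000, n)
outside_temp = rng.uniform(-5, 35, n)
setpoint = rng.uniform(18, 27, n)
towers = rng.integers(1, 6, n)

# Fake "ground truth" PUE with noise, standing in for metered facility data.
pue = (1.10 + 0.004 * outside_temp - 0.002 * (setpoint - 18)
       + 0.01 / towers + rng.normal(0, 0.005, n))

X = np.column_stack([it_load, outside_temp, setpoint, towers])
X_train, X_test, y_train, y_test = train_test_split(X, pue, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))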

Submission + - New Approach to Immersion Cooling Powers HPC in a High Rise (datacenterknowledge.com)

miller60 writes: How do you cool a high-density server installation inside a high rise in Hong Kong? You dunk the servers, immersing them in fluid to create an extremely efficient HPC environment in a hot, humid location. Hong Kong's Allied Control developed its immersion cooling solution using a technique called open bath immersion (OBI), which uses 3M's Novec fluid. OBI is an example of passive two-phase cooling, which uses a boiling liquid to remove heat from a surface and then condenses the liquid for reuse, all without a pump. It's a slightly different approach to immersion cooling than the Green Revolution technique being tested by Intel and deployed at scale by energy companies. Other players in immersion cooling include Iceotope and Hardcore (now LiquidCool).
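
As a back-of-envelope illustration of two-phase cooling, the heat a boiling fluid carries away equals the boil-off rate times its latent heat of vaporization. The sketch below assumes a placeholder latent heat of roughly 100 kJ/kg rather than a 3M datasheet value.

# Back-of-envelope sketch of passive two-phase cooling: vapor leaving the bath
# carries away (boil-off mass rate) x (latent heat), then condenses for reuse.
# The latent heat is an assumed round number, not a Novec datasheet figure.

LATENT_HEAT_J_PER_KG = 100e3  # assumed ~100 kJ/kg, placeholder value

def heat_removed_watts(boiloff_kg_per_s: float) -> float:
    """Heat carried away by vapor leaving the immersion bath."""
    return boiloff_kg_per_s * LATENT_HEAT_J_PER_KG

def boiloff_needed_kg_per_s(server_heat_watts: float) -> float:
    """Boil-off rate required to absorb a given server heat load."""
    return server_heat_watts / LATENT_HEAT_J_PER_KG

if __name__ == "__main__":
    # A hypothetical 20 kW immersion tank:
    rate = boiloff_needed_kg_per_s(20_000)
    print(f"~{rate:.2f} kg/s of fluid must boil off to absorb 20 kW")
    print(f"that vapor returns {heat_removed_watts(rate) / 1000:.1f} kW when condensed")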

Submission + - How the Leap Second Bug Led Facebook to Build DCIM Tools (datacenterknowledge.com)

miller60 writes: On July 1, 2012, the leap second time-handling bug caused many Linux servers to get stuck in a loop. Large data centers saw power usage spike, sometimes by megawatts. The resulting "server storm" prompted Facebook to develop new data center infrastructure management (DCIM) software, providing real-time data on everything from the servers to the generators. The incident also offered insights into the value of flexible power design in its server farms, which kept the status updates flowing as the company nearly maxed out its power capacity.
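
The core DCIM idea, polling power readings across the facility and flagging a fleet-wide spike, can be sketched as below; the device names, meter stub, and threshold are assumptions, not Facebook's tools.

# Minimal sketch of the DCIM idea: continuously sample power readings across the
# facility and alert when total draw spikes, as it did during the leap-second
# incident. Devices, the read_power_watts() stub, and the 10% threshold are
# illustrative assumptions.
import random
import time

DEVICES = ["rack-a1", "rack-a2", "generator-1", "ups-3"]  # hypothetical inventory

def read_power_watts(device: str) -> float:
    """Stand-in for a real meter/BMS query (SNMP, Modbus, etc. in practice)."""
    return random.uniform(4000, 6000)

def total_facility_power() -> float:
    return sum(read_power_watts(d) for d in DEVICES)

def monitor(poll_seconds: float = 1.0, spike_threshold: float = 1.10, samples: int = 5):
    baseline = total_facility_power()
    for _ in range(samples):
        time.sleep(poll_seconds)
        current = total_facility_power()
        if current > baseline * spike_threshold:
            print(f"ALERT: power spiked to {current:.0f} W (baseline {baseline:.0f} W)")
        # Exponential moving average keeps the baseline tracking slow drift.
        baseline = 0.9 * baseline + 0.1 * current

if __name__ == "__main__":
    monitor()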

Submission + - U.S. Government Data Center Count Rises to 7,000

miller60 writes: The U.S. government keeps finding more data centers. Federal agencies have about 7,000 data centers, according to the latest stats from the ongoing IT consolidation process. The number started at 432 in 1999, but soon began to rise as agencies found more facilities, and exploded once the Obama administration decided to include server closets as well as dedicated data centers. The latest estimate is more than double the 3,300 facilities the government thought it had last year. The process has led to the closure of 484 data centers thus far, with another 855 planned over the next year. The GAO continues to call for the process to look beyond the number of facilities and focus on savings.

Submission + - Sears is Turning Shuttered Stores Into Data Centers (datacenterknowledge.com)

miller60 writes: Servers may soon fill the aisles where shoppers once roamed. Sears Holdings is seeking to convert former Sears and Kmart stores into Internet data hubs. Some stand-alone stores and distribution centers may be repurposed as data centers, while mall-based stores can be converted into disaster recovery sites, the company says, offering access to stores and eateries for displaced workers who may be on site for weeks. Then there's the wireless tower opportunity. Seventy percent of the U.S. population lives within 10 miles of a Sears or Kmart store, and these rooftops can be leased to fill gaps in cell coverage. It's not the first effort to convert stores into IT infrastructure, as Rackspace is headquartered in an old mall, and companies have built data centers in malls in Indiana and Maryland. But Sears, which operates 25 million square feet of real estate, hopes to make this strategy work at scale.
Supercomputing

Submission + - Titan is New Champ in Supercomputing's Top500 (top500.org)

miller60 writes: The new Top500 list of the world's most powerful supercomputers is out, and the new champion is Titan, the upgraded system that previously ruled the Top500 as Jaguar. Oak Ridge Labs' Titan knocked Livermore Labs' Sequoia system out of the top spot with a Linpack benchmark of more than 17 petaflops. Check out the full list, or an illustrated guide to the top 10.
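
For context, a Linpack (HPL) score is the benchmark's nominal operation count for solving a dense N-by-N linear system divided by wall-clock time. The sketch below uses made-up inputs chosen only to land near Titan's range, not its actual run parameters.

# How an HPL (Linpack) score is computed: solve a dense N x N system, credit the
# run with 2/3*N^3 + 2*N^2 floating-point operations, divide by runtime.
# The N and runtime below are hypothetical, not Titan's real figures.

def hpl_flops(n: int) -> float:
    """Nominal operation count HPL credits for an N x N solve."""
    return (2.0 / 3.0) * n**3 + 2.0 * n**2

def linpack_petaflops(n: int, runtime_seconds: float) -> float:
    return hpl_flops(n) / runtime_seconds / 1e15

if __name__ == "__main__":
    # Hypothetical problem size and runtime, for illustration only:
    print(f"{linpack_petaflops(n=10_000_000, runtime_seconds=40_000):.1f} petaflops")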
Hardware

Submission + - New York Data Centers Battle Floods, Utility Outages (datacenterknowledge.com)

miller60 writes: At least three data center buildings in lower Manhattan are struggling with power problems amid widespread flooding and utility outages caused by Hurricane Sandy. Flooded basements at two sites took out diesel fuel pumps, leaving them unable to refuel generators on higher levels. One of these was Datagram, which knocked out Buzzfeed and the Gawker network of sites. At 111 8th Avenue, some tenants lost power when Equinix briefly experienced generator problems.
Facebook

Submission + - Open Compute Hardware Adapted for Colo Centers (datacenterknowledge.com)

1sockchuck writes: Facebook has now adapted its Open Compute servers to work in leased data center space, a step that could make the highly efficient "open hardware" designs accessible to a broader range of users. The Open Compute Project was launched last year to bring standards and repeatable designs to IT infrastructure, and has been gaining traction as more hardware vendors join the effort. Facebook's move to open its designs has been a welcome departure from the historic secrecy surrounding data center design and operations. But energy-saving customizations that work in Facebook's data centers present challenges in multi-tenant facilities. To make it work, Facebook hacked a rack and gave up some energy savings by using standard 208V power.
Security

Submission + - Go Daddy: Network Issues, Not Hacks or DDoS, Caused Downtime (datacenterknowledge.com)

miller60 writes: Go Daddy says yesterday's downtime was caused by internal network problems that corrupted data in router tables. "The service outage was not caused by external influences," said Scott Wagner, Go Daddy's interim CEO. "It was not a 'hack' and it was not a denial of service attack (DDoS)." The outage lasted at least six hours and affected websites and email for customers of the huge domain registrar.
Games

Submission + - Atari turns 40 today (time.com)

harrymcc writes: "On June 27, 1972, a startup called Atari filed its papers of incorporation. A few months later, it released its first game, Pong. I celebrated the anniversary over at TIME.com by chatting with the company's indomitable founder, Nolan Bushnell, who also started Chuck E. Cheese and more than 20 other companies--mostly unsuccessful, but often visionary--and hired and influenced Steve Jobs when he was an antisocial Reed College dropout."
Space

Submission + - Hawking is First User of "Big Brain" Supercomputer (datacenterknowledge.com)

miller60 writes: Calling your product the "Big Brain Computer" is a heady claim. It helps if you have Dr. Stephen Hawking say that the product can help unlock the secrets of the universe. SGI says its UV2 can scale to 4,096 cores and 64 terabytes of memory, delivers a peak I/O rate of four terabytes per second, and runs off-the-shelf Linux software. Hawking says the UV2 "will ensure that UK researchers remain at the forefront of fundamental and observational cosmology."
Power

Submission + - Is a 'Net Zero' Data Center Possible? (datacenterknowledge.com)

miller60 writes: HP Labs is developing a concept for a "net zero" data center — a facility that combines on-site solar power, fresh air cooling and advanced workload scheduling to operate with no net energy from the utility grid. HP is testing its ideas in a small data center in Palo Alto with a 134kW solar array and four ProLiant servers. The proof-of-concept confronts challenges often seen in solar implementations, including the array’s modest capacity and a limited window of generation hours – namely, when the sun shines. HP's approach focuses on boosting server utilization, juggling critical and non-critical loads, and making the most of every hour of solar generation. Can this concept work at scale?
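
The workload-scheduling piece, shifting deferrable jobs into hours of peak solar output, can be sketched as a simple greedy placement; the generation profile, job list, and policy below are assumptions, not HP Labs' scheduler.

# Sketch of the "net zero" scheduling idea: place deferrable (non-critical) jobs
# into the hours when on-site solar generation is highest, so the grid covers as
# little as possible. Solar profile, jobs, and greedy policy are illustrative.

# Forecast solar output (kW) for each hour of the day, shaped like a 134 kW array.
SOLAR_FORECAST_KW = [0, 0, 0, 0, 0, 5, 25, 60, 95, 120, 130, 134,
                     130, 120, 95, 60, 25, 5, 0, 0, 0, 0, 0, 0]

# Deferrable jobs: (name, power draw in kW, duration in hours).
JOBS = [("batch-analytics", 40, 3), ("backup", 20, 2), ("index-rebuild", 30, 2)]

def schedule(jobs, solar):
    """Greedy: give each job the contiguous window with the most solar energy left."""
    remaining = list(solar)
    plan = {}
    for name, power_kw, hours in sorted(jobs, key=lambda j: -j[1] * j[2]):
        best_start = max(range(len(remaining) - hours + 1),
                         key=lambda s: sum(remaining[s:s + hours]))
        plan[name] = best_start
        for h in range(best_start, best_start + hours):
            remaining[h] = max(0.0, remaining[h] - power_kw)
    return plan

if __name__ == "__main__":
    for job, start in schedule(JOBS, SOLAR_FORECAST_KW).items():
        print(f"{job}: run at {start:02d}:00")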
