[TR] Facebook's First Actual Cloud


An indoor cloud appears in Facebook's server room
Original publication date: June 08, 2013
Original article from: The Register | Jack Clark

  Facebook's first data center ran into problems of a distinctly ironic nature when a literal cloud formed in the IT room and started to rain on servers.

  Though Facebook has previously hinted at this via references to a "humidity event" within its first data center in Prineville, Oregon, the social network's infrastructure king Jay Parikh told The Reg on Thursday that, for a few minutes in the summer of 2011, Facebook's data center contained two clouds: one powered the social network, the other poured water on it.

  "I got a call, 'Jay, there's a cloud in the data center'," Parikh says. "'What do you mean, outside?'. 'No, inside'."

  There was panic. "It was raining in the datacenter," he explains.

  The problem occurred because of the ambitious chiller-less air conditioning system the data center used. Unlike traditional facilities, which use electricity-intensive, direct-expansion cooling units to maintain a low, steady temperature, consumer internet giants such as Google, Facebook, and others have all been on a tear building facilities that use outside air instead.

  In Prineville's first summer of operation, a problem in the facility's building-management system led to high temperature and low humidity air from the hot aisles being endlessly recirculated through a water-based evaporative cooling system that sought to cool the air down – which meant that when the air came back into the cold aisle for the servers it was so wet it condensed.

  As Facebook rather dryly put it at the time:

  This resulted in cold aisle supply temperature exceeding 80°F and relative humidity exceeding 95%. The Open Compute servers that are deployed within the data center reacted to these extreme changes. Numerous servers were rebooted and few were automatically shut down due to power supply unit failure.
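
  As a back-of-the-envelope check on those numbers, the sketch below uses the Magnus dew-point approximation (a standard psychrometric formula, not anything from Facebook's own tooling) to show how little margin air at 80°F and 95% relative humidity leaves before condensation.

    import math

    def dew_point_c(temp_c: float, rh_percent: float) -> float:
        """Approximate dew point in deg C via the Magnus formula (WMO coefficients)."""
        a, b = 17.62, 243.12
        gamma = math.log(rh_percent / 100.0) + (a * temp_c) / (b + temp_c)
        return (b * gamma) / (a - gamma)

    # Cold-aisle conditions Facebook reported: above 80 deg F (~26.7 deg C), above 95% RH.
    temp_c = (80.0 - 32.0) * 5.0 / 9.0
    dp = dew_point_c(temp_c, 95.0)
    print(f"air {temp_c:.1f} C, dew point {dp:.1f} C ({dp * 9 / 5 + 32:.1f} F)")
    # The dew point comes out within about 1 deg C of the air temperature, so any
    # surface even slightly cooler than the supply air (chassis lids, power supplies,
    # ducting) starts collecting water.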

  Some servers broke entirely because they had front-facing power supplies and these shorted out. For a few minutes, Parikh says, you could stand in Facebook's data center and hear the pop and fizzle of Facebook's ultra-lean servers obeying the ultra-uncompromising laws of physics.

  Facebook learned from the mistakes, and now designs its servers with a seal around their power supply, or as Parikh calls it, "a rubber raincoat."

  "This is one of those things. When you are 100 per cent aircooled it's awesome from an efficiency perspective, but the range you have to operate in is much, much wider," Parikh says.

  The company also improved its building-management system to make sure that the error couldn't happen again. These days, Facebook's data centers are some of the most efficient bit barns in the entire cloud industry – they even sometimes beat Google's own facilities.
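
  The article doesn't describe the fix itself, so the sketch below is only a hypothetical illustration of the kind of interlock a building-management system might gain from an incident like this; the thresholds, mode names, and choose_cooling_mode function are invented for the example and are not Facebook's actual control logic.

    # Hypothetical interlock: refuse to run the evaporative (misting) stage on air
    # that is already wet, instead of blindly recirculating it. Thresholds and mode
    # names are invented for illustration.
    MAX_SUPPLY_RH = 80.0       # assumed ceiling for cold-aisle relative humidity (%)
    MAX_SUPPLY_TEMP_C = 27.0   # assumed ceiling for cold-aisle supply temperature

    def choose_cooling_mode(supply_temp_c: float, supply_rh: float) -> str:
        """Pick a cooling mode based on cold-aisle supply conditions."""
        if supply_rh > MAX_SUPPLY_RH:
            # Misting already-humid recirculated air is what produced the indoor
            # cloud; bring in drier outside air rather than adding more water.
            return "outside-air-only"
        if supply_temp_c > MAX_SUPPLY_TEMP_C:
            return "evaporative-cooling"
        return "free-cooling"

    print(choose_cooling_mode(27.5, 95.0))   # -> outside-air-only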

  Since then, the giant hasn't been graced with any other clouds within its cloud. But we do wish it would happen again, just so they could snap a picture.

Original source: Facebook's first data center DRENCHED by ACTUAL CLOUD • The Register
