Tuesday, December 30, 2014

Chassis Plans SPAWAR NAVSSI System Barge Test

60 lbs of HBX-1 explosive detonated 20 feet from the test barge

MIL-S-901D – the gold standard of shock testing.


See the video at https://www.youtube.com/user/chassisplans.


Chassis Plans’ rugged military-grade 4U M4U20 computer system has been successfully tested to MIL-S-901D Grade A for use in the SPAWAR NAVSSI rack. The test requires 4 shots, or explosions, in the water next to a barge in which the equipment is mounted; hence the common name, the Barge Test. Each shot is 60 pounds of HBX-1 placed 24 feet underwater, between 20 and 40 feet from the barge.


Two testing grades are available. Grade A requires that the equipment still be functional after the 4 shots. Grade B allows the equipment to stop functioning, but no parts may come loose that could endanger personnel or nearby equipment.


M4U20 Military Grade 4U Computer System


The Chassis Plans M4U20 computer passed the more rigorous Grade A requirement by continuing to function after all 4 shots without any repair or intervention.


The purpose of the test is to simulate Navy ships or submarines being hit with a torpedo, aerial bombardment or hitting a mine. Grade A equipment is necessary for the continued functioning of the ship after such an attack.


It might be noted the M4U system was a standard configuration and was not enhanced in any manner to prepare it for this punishing test.


Chassis Plans continues to support SPAWAR in their efforts to equip America’s war fighters with the most reliable and advanced equipment possible.


See Chassis-Plans.com for additional information.




Tuesday, December 23, 2014

2015 Chassis Plans Calendars are Here

Chassis Plans’ 2015 Industrial Computer Calendar


The Chassis Plans 2015 calendar has arrived.  Copies are free to qualified users in the industrial and military computer markets.  Go to www.chassis-plans.com/industrial-computer-calendar.html to request your own copy.


 


This is a beautiful large-format 11 x 17-inch work of art that everybody loves.




Monday, December 15, 2014

Big Data: Interesting Facts and Figures

There is a lot of talk in the media about “Big Data.”  Here are some fun facts on the subject:


  • It took from the dawn of civilization until 2003 for the world to generate 1.8 zettabytes (one zettabyte is 10 to the 12th gigabytes) of data. By 2011, the same amount was being generated every two days on average.

  • In the year 2011 there were 12 million RFID tags sold worldwide.  That number is projected to be 209 billion by 2021.

  • There are 750 million photos uploaded to Facebook every two days.

  • 1/3 of all data will be stored in or pass through the cloud by the year 2020, amounting to 35 zettabytes of combined data.

  • There are almost as many bits of information in the digital universe as there are stars in the physical universe.

  • There are over 247 billion e-mail messages sent each day.  Up to 80% of them are spam.

  • 48 hours of video are uploaded to YouTube every minute, resulting in 8 years’ worth of digital content each day.

  • The world’s data doubles every two years.

  • Oil drilling platforms have 20,000 to 40,000 sensors.

  • Twitter processes 7 terabytes of data every day.

  • The number of text messages sent and received every day exceeds the total population of the planet.

  • Facebook processes 10 terabytes of data every day.

  • Decoding the human genome took 10 years; now it can be accomplished in one week.

  • 571 new websites are created every minute of the day.

  • U.S. drone aircraft sent back 24 years’ worth of video footage in 2009.

  • In 2011, Google had over 3 million servers processing over 1.7 trillion searches per year.
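A couple of these figures can be sanity-checked with simple arithmetic. The sketch below is a back-of-the-envelope calculation in Python using only the numbers quoted in the list (the zettabyte-to-gigabyte conversion, the two-day generation rate, and the two-year doubling claim):

```python
# Back-of-the-envelope checks on the Big Data figures above.

GB_PER_ZB = 10**12  # one zettabyte = 10^21 bytes = 10^12 gigabytes

civilization_total_zb = 1.8   # data generated through 2003, in zettabytes
days_to_match_in_2011 = 2     # the same amount generated every 2 days in 2011

# Implied 2011 generation rate, in zettabytes per year
zb_per_year_2011 = civilization_total_zb / days_to_match_in_2011 * 365
print(f"{civilization_total_zb} ZB = {civilization_total_zb * GB_PER_ZB:.2e} GB")
print(f"Implied 2011 rate: roughly {zb_per_year_2011:.0f} ZB per year")

# "The world's data doubles every two years" implies ~41% annual growth
annual_growth = 2 ** (1 / 2) - 1
print(f"Doubling every 2 years = {annual_growth:.0%} growth per year")
```

Worth noting: the two-day claim for 2011 implies on the order of 330 zettabytes generated that year, which dwarfs the 35-zettabyte cloud projection for 2020; figures like these from different sources rarely reconcile exactly.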

 




Wednesday, December 3, 2014

Cloud Computing and Big Data - Two Peas in a Pod?

There have been a multitude of blogs and articles about Cloud Computing and Big Data but usually as individual topics. As the use of cloud computing increases, the processing of Big Data becomes more prominent. Why is that?


 


Figure 1: Ruggedized Server Cloud-In-A-Case System


Cloud computing is the latest buzzword for the future of computing, but it is really not a new term. The first acknowledged use of “cloud computing” has been traced to 1996 and Compaq Computer. Their vision was detailed and timely: not only would all business software move to the web, but what they termed “cloud computing-enabled applications” like consumer file storage would become common (Technology Review 10/11). Of course, in 1996 network and computer technology were not yet capable of implementing many of the ideas behind cloud computing.


 


Fast forward ten years, and Amazon.com introduced the Elastic Compute Cloud (EC2): three words describing expandable computing power hosted somewhere else. In 2006 Amazon released a beta version of EC2 to the public on a first-come, first-served basis. EC2 went into production in October 2008, when the beta label was removed and EC2 was offered as a supported Amazon service. Now Amazon cloud and web services are used not only by commercial enterprises but also by government and military users.


 


Said Jeff Bezos at an Amazon shareholder meeting: “We’re really focused on what we call infrastructure Web services…Amazon Web Services is focused on very deep infrastructure. It has the potential to be as big as our retail business. It’s a very large area and right now it’s done, in our opinion, in a very inefficient way. Whenever something big is done inefficiently, that creates an opportunity.”


 


Big data is another term with a long history. There are several early references to it, but the most cited is from John Mashey in the mid-1990s, when he was Chief Scientist at Silicon Graphics. An example of his use of the term “Big Data” is available in the public domain in a presentation given at Usenix in 1999 – “Big Data and the Next Wave of Infrastress”.  Mr. Mashey also seems to have coined another term appropriate for current times – Infrastress, or stress on the infrastructure.


 


The uses of big data are everywhere, with examples as diverse as the Library of Congress storing “tweets” for future review and study. As of March 2013 the library had stored 170 billion tweets and was adding 150 million a day. Another example is the military capturing real-time video data of an area of interest. One sensor system the military uses is Gorgon Stare, a spherical array of nine cameras attached to an aerial drone. Used as a wide-area surveillance and sensor system, Gorgon Stare can generate several terabytes of data per minute. Mounted on a drone with 24-hour loiter capability, the amount of data available is staggering.
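To put the Gorgon Stare numbers in perspective, a single sortie's data volume can be roughed out in a few lines of Python. This is a hypothetical illustration: the exact per-minute rate is not stated above, so 2 TB/minute is assumed as a stand-in for “several terabytes per minute”:

```python
# Rough data volume for one Gorgon Stare sortie, using the figures above.
TB_PER_MINUTE = 2   # assumed stand-in for "several terabytes per minute"
LOITER_HOURS = 24   # 24-hour loiter capability

total_tb = TB_PER_MINUTE * LOITER_HOURS * 60  # 60 minutes per hour
total_pb = total_tb / 1000
print(f"One sortie: roughly {total_tb:,} TB ({total_pb:.2f} PB) of raw sensor data")
```

Even at this conservative rate, one drone on one loiter produces petabytes of raw video, which is exactly the kind of volume that motivates co-locating storage and compute in the cloud.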


 


Analyzing this data and presenting intelligible results would require network bandwidth and computing power far beyond what is available to the average user. The concept of virtualization has provided a way to integrate the compute part of cloud computing with the storage of massive amounts of data. By locating the compute resource in the cloud, the user is given access to a virtual computer with computing power sized to run the desired application. Chassis Plans wrote an article in COTS Journal about ruggedized virtual access for military applications – “Ruggedized Servers for the Data Centric Military Environment” – in which the author describes the type of server needed to access the cloud and big data.


 


Today’s limits on network access for remote users, and on local computing resources, make the cloud an appropriate technology. By co-locating the data and compute resources in the same area, the best use can be made of the infrastructure, and the user can access interpreted data on devices such as tablets and smart phones. In other words, without the cloud the use of big data is very limited, and without the cloud to store big data, data-mining applications would not be available. Two peas in a pod.


 


What are the next hurdles? There are several challenges facing users today. The first is developing the applications necessary to process the data: the data is now available, but the algorithms to process it are not. Beyond the applications, security is a major issue. With the latest hacks of both government and private databases, encryption and cyber defense are fast becoming strategic to future technology advances and the expansion of cloud computing.


 


The big news last week was the hack of Sony Pictures, which caused massive damage to their email services and the release of several unreleased movies.  Cyber defense is not just a concern for the government and military.


