Government Online Services from CENTRAL, STATES and UNION Territories

AbbudA! India

Discover and Use from 11,000+ online Government Services

Cloud Computing: The Real Meaning


During my classes and workshop presentations, audiences often ask what the real difference is between cloud computing and "the internet" or "web services": isn't the internet the same as cloud computing? The answer is that cloud computing does not equal the internet or the web, or vice versa; the internet is simply the best delivery platform that cloud computing makes use of. It is possible to have a cloud computing infrastructure totally isolated from the internet; this is called a private cloud. Some private clouds are made accessible via the internet (much like the VPNs of earlier days), but the point still stands: cloud computing does not automatically mean or require the internet. A cloud computing infrastructure can be built entirely within an enterprise, without ever touching the internet.


What makes cloud computing distinctive is not the underlying hardware or infrastructure, for those technologies have existed for quite a long time. What makes it cloud computing is the way that services and functions are composed, managed and delivered. In essence, cloud computing means that the work is done by a server located somewhere so "abstracted" that the end user need not concern herself with where. The server is somewhere out there, "in the clouds".
Underlying all the services and applications labeled "cloud services" are servers, many of them, housing the processing elements (CPU, RAM, GPU) and the storage devices, and located in data centers. Traditionally, however, a server is dedicated to a single task, application or function, which it serves over a local network or, more typically, via the internet. This is very limiting for cloud computing: you would need a great many servers for multiple applications, taking up space, producing heat, and soaking up electricity like sponges. The answer to this is the one technology that makes all the difference: virtualization, in this case server virtualization.
Since a traditional server can serve only one purpose, you would otherwise need more of them. Server virtualization is the answer: the creation of multiple "virtual" servers in software, running on only one or a few physical machines. A powerful server can host hundreds of virtual servers, each with its own hardware specification such as CPU speed, RAM size, and storage capacity. Each of these virtual servers can be put to a single use, provided they do not all require vast amounts of computing power at once. This means that multiple applications and services can be hosted on a single physical server or a small group of them, a very efficient use of hardware resources.
Virtualization is fundamental to cloud computing and brings another benefit the cloud is famous for: scalability. Each virtual server is allocated only as much computing power and storage capacity as the client needs, so more virtual servers can be created alongside it. If the client's needs grow, more power and capacity can be allocated to that virtual server, or reduced when no longer needed. And because clients pay only for the computing power and capacity they actually use, this can be very affordable for most clients. Without virtualization, cloud computing as we know it would not exist, or would take a very different form; virtualization made the cloud computing dream come true. The article is inspired by this
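The allocation-and-scaling model described above can be sketched in a few lines of code. This is a hypothetical illustration (the class and method names are invented, not any real hypervisor API), but it captures how a physical server's fixed capacity is carved into virtual servers that can later grow on demand:

```python
class PhysicalServer:
    """A host with fixed CPU cores and RAM (in GB) to carve into virtual servers."""

    def __init__(self, cores, ram_gb):
        self.free_cores, self.free_ram = cores, ram_gb
        self.vms = {}

    def create_vm(self, name, cores, ram_gb):
        # Allocate only as much capacity as the client needs right now.
        if cores > self.free_cores or ram_gb > self.free_ram:
            raise RuntimeError("insufficient capacity on this host")
        self.free_cores -= cores
        self.free_ram -= ram_gb
        self.vms[name] = {"cores": cores, "ram_gb": ram_gb}

    def scale_vm(self, name, extra_cores, extra_ram_gb):
        # Grow an existing virtual server (negative deltas would shrink it).
        if extra_cores > self.free_cores or extra_ram_gb > self.free_ram:
            raise RuntimeError("insufficient capacity to scale")
        self.free_cores -= extra_cores
        self.free_ram -= extra_ram_gb
        self.vms[name]["cores"] += extra_cores
        self.vms[name]["ram_gb"] += extra_ram_gb


host = PhysicalServer(cores=32, ram_gb=128)
host.create_vm("web", cores=2, ram_gb=4)   # a small VM for a website
host.create_vm("db", cores=4, ram_gb=16)   # a bigger VM for a database
host.scale_vm("web", extra_cores=2, extra_ram_gb=4)  # the website's needs grew
print(host.vms, host.free_cores, host.free_ram)
```

A real hypervisor adds CPU scheduling, memory overcommit and live migration on top of this bookkeeping, but the billing consequence is the same: the client pays for the four cores currently allocated, not for the whole 32-core host.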

Myanmar Moves to the Cloud: Builds first Cloud Environment


Daiwa Institute of Research Ltd. (DIR), Fujitsu Limited, and KDDI Corporation today announced that they have collaborated to build the Republic of the Union of Myanmar’s first cloud computing environment. Built for the Central Bank of Myanmar, the new cloud environment is designed to improve efficiency in the bank’s operations. It consists of a private cloud platform designed, constructed, and operated in compliance with the Alliance Cloud, a standardized cloud model certified by the DIR-led Global Alliance for User-driven Cloud Computing, as well as a desktop service that features security countermeasures.


In advance of the fast-approaching economic integration of ASEAN nations scheduled for 2015, Myanmar, now rapidly implementing democratic reforms, has been actively seeking to modernize its financial sector by relaxing financial regulations, making preparations to establish a stock exchange [1] and taking other initiatives. Under these circumstances, operating stability at the Central Bank of Myanmar is ever-more crucial to the country’s financial system given its pivotal role in issuing and managing currency and implementing monetary policy.

Cloud computing making rapid advances

"The technology of cloud computing is a significant leap in the optimal use of resources, whether infrastructure, software platforms or commercial services. With high-quality connectivity between the sources and end users of information and resources, cloud computing services are likely to contribute to the deployment of advanced automation systems for all kinds of companies at cost-effective prices."


"The technical world today is running at an accelerated pace, each day witnessing significant shifts and developments created by people's need to communicate on social media. Facebook, for example, recorded millions of users within a few years; this large turnout came from the desire inherent in humans to communicate, without which it wouldn't have succeeded."

Online education and cloud computing collide in T.O. startup

Every once in a while, industries collide and create immense opportunity. It happened a few years ago with smartphones, when increasingly more powerful computing was enhanced by the explosive growth of the wireless industry.

Desire2Learn Inc. says it is staring into a "perfect storm" of two industries converging: cloud computing and the enormous technological revolution taking place in the education sector. "We have a really hot market, at a time when it's going to explode."

D2L has become one of the largest global providers of advanced software and mobile applications for the education industry. The Kitchener, Ontario-based company reaches about eight million people around the world every day with programs that allow students to submit assignments online, mobile applications that enable them to stream lectures, and tools that make it possible for teachers to return assignments with audio criticisms attached to the electronic file. This feedback from teachers is often faster and more useful for the student than a written comment. Here is the complete story 

How does cloud computing link a learner's life at school to life outside school?

Practical Education
Educators all over the world recommend that a learner's life at school be linked to life outside the school. In the case of professional courses such as engineering or management, that outside life invariably means professional, industrial, entrepreneurial or civic life.


Cloud computing and social media turn out to be very resourceful in linking the learner in school to "life outside the school". Here are some possible scenarios.
  • At the first level, simple steps such as having accounts on social media sites and using resources such as Wikipedia provide an initial experience of the environment in which most institutions and companies participate, to different degrees: for example, the Facebook page of a university where the learner is planning her higher education, or the marketing campaign of a well-known company.
  • At the second level, using cloud tools helps learners increase their productivity, participation and visibility in the connected world. It also makes them a "ready to use" resource for industry: for instance, practical knowledge of collaboration features, posting on blogs, and creating and joining online groups.
  • At the third level, industry-strength systems such as Google App Engine (GAE) or Amazon Web Services (AWS) help students develop projects and master many features, making them a "ready to deploy" workforce.
  • At the fourth level, a general awareness of the cloud computing phenomenon, and of how institutions such as governments are embracing it, prepares the learner for civic life, where her interactions with such organizations may happen at a cloud interface, as with the Aadhaar initiative.
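For the third level above, a concrete starting point helps. Here is a minimal sketch of the kind of web service a student might build and then deploy to a platform such as GAE or AWS; the app and its greeting are invented for illustration, and each platform layers its own deployment configuration on top of this:

```python
def application(environ, start_response):
    """A minimal WSGI app, the Python interface many cloud platforms can host."""
    body = b"Hello from the cloud classroom!"
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]


# Exercise the app directly, the way a WSGI server (or cloud platform) would.
statuses = []
result = application({}, lambda status, headers: statuses.append(status))
print(statuses[0], result[0].decode())
```

Locally this callable can be served with the standard library's wsgiref server; on a cloud platform the same callable is handed to the provider's runtime instead.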

Cloud Economics: The Business Side of Cloud Computing

By 2020, cloud is expected to be a $240B industry. It's no wonder, then, that businesses are trying to figure out how to harness this power to achieve agility through IT transformation. In this "Cloud Economics" infographic, VMware outlines the types of clouds being deployed, the adoption rates of leading countries, and forecasts on what virtualization means for jobs and market competition in the years to come. Here is a detailed report from VMware



Social media, cloud computing hold promise for Indian IT in 2013


Here is an interesting peek into 2013 from FirstPost
IT companies seem to be ready for this new "normal" and are now embracing technologies like social media, mobility, analytics and cloud (SMAC) to optimise and ensure efficiency in the business environment, all within flat or lower-than-usual IT budgets.
While the global macroeconomic scenario remains uncertain in the coming years, the industry will continue to exhibit resilience and adaptability in continually reinventing itself to retain its appeal to clients, Nasscom said.
                                                        Social Media and cloud computing 
“The year 2012 has been a landmark year for the Indian IT industry… At such a large base, we expect the industry to clock double digit growth in FY 2013 which exhibits that despite global uncertainties, IT-BPM industry has moved from efficiency to effectiveness,” it added in an emailed response.
2012 was a mixed year for the top five IT services firms. While Tata Consultancy Services (TCS), HCL Technologies and Cognizant saw good growth, Infosys and Wipro lagged their peers. TCS and HCL Technologies registered quarterly revenue growth of 13 percent and 17 percent in dollar terms, while Infosys and Wipro both registered under 5 percent growth.

Cloud Computing @ Wimbledon

This video discusses the benefits the Wimbledon club realized from utilizing a cloud infrastructure to meet the viewing demands of its global audience. Learn more about how leveraging the cloud allowed it to adjust system workloads over the course of the event to optimize IT efficiency.

Cloud computing: Only 5% techies are job-ready

Here is an interesting report from the Times of India on an opportunity in the domain of cloud computing in the days to come.

While cloud computing is widely recognized as the next big opportunity to watch out for, it has already made significant inroads in the industry. However, the IT workforce may not be keeping pace with the developments. The industry-ready workforce for new domains like cloud and mobility may be as low as 5.7% of the current IT workforce, said Rajiv Sodhi, senior corporate vice president, HCL Technologies.



An internal HCL report on IT careers said that the traditional "IT industry career is in decline, threatening the careers of 10 million employees". The global IT workforce is estimated at around 20 million, of which around 3 million are in India.
The Indian cloud market may grow by more than 70% in 2012, according to the Indian Cloud Market Overview 2011-2016 report by International Data Corporation (IDC). It stood at $535 million in 2011. The report also said the market is likely to grow 50% over the next three years.
While it is not that IT jobs are disappearing, there is a clear shift in the nature of the jobs. While traditional IT jobs are likely to get automated, jobs in newer domains like cloud computing will be created.
The HCL report further says that the churn of experienced employees from IT to other sectors has risen by 15-20% over the past year. "Training existing employees to deal with newer technologies is the dominant trend. Only 30% of the workforce, which is already skilled, is being hired from outside organisations," said Kalyan Kumar, AVP and Head of Cross Functional Services, HCL Technologies. For the complete report, read here

The Future of Mobile Cloud Computing


Use of cloud services at home, in the workplace and in large enterprises has steadily and significantly increased. Now we are seeing a similar trend with mobile devices and cloud technology. Mobile devices already access a number of cloud services, such as Dropbox, and more third-party applications are utilizing cloud computing technology. It is only a matter of time until the technology becomes the central force behind mobile applications. Where will the future of the mobile cloud lie?

"Mobile platforms are already accessing the cloud for a lot of consumer-based services such as email, social media, online file storage and corporate communications tools. But so far, there are essentially only two players here: the individual or consumer, and the consumer cloud service," said Dan Matthews, chief technology officer with IFS North America. "One of the biggest changes I think we will see in the next year or two is the entry of a third player: the corporate back-end system (e.g., corporate ERP, Financials, SCM and other mission-critical systems)." Here is a complete post in CloudTimes

Research as a Service: Cloud Computing Accelerates Scientific Discovery


"Towards Research as a Service?", an interesting post by Microsoft
Over the past two years, we have seen growing interest from the scientific community in using public clouds for research. As part of the original Cloud Research Engagement Initiative in 2010, Microsoft partnered with funding agencies all over the world to award more than 75 research teams for projects using Microsoft’s Windows Azure cloud. The research covers topics in computer science, biology, physics, chemistry, social science, geology, ecology, meteorology and drug discovery. More details about these projects can be found here.

                                                Research as a Service
From an informal survey of these projects, we learned researchers value the concept of using an on-demand, scalable compute resource over acquiring, deploying and managing a dedicated resource. Ninety percent of these researchers were pleased with their ROI using cloud services to build their application and would use cloud resources again. Of course, this sample is biased. These researchers are, for the most part, the leading edge risk takers and early adopters.
The most enthusiastic responses to our survey came from research teams with limited access to large scale computational resources. These were also the research teams with a community of users who had a pressing need for help from the cloud. For example, a research team at the Geography of Natural Disasters Laboratory at the University of Aegean in Greece built a cloud application that can be used to simulate wildfire propagation. The end users of the service are primarily emergency responders, including the fire service, fire departments and civil protection agencies that must deal with wildfires on the island of Lesvos.
The idea of using the cloud to help broader communities extends to scientific disciplines. The vast majority of scientists don’t want to learn to program a cluster, a supercomputer or a cloud. They want to get on with their science. This describes the vast majority of the research community.
The problem faced by these researchers  results from the avalanche of data from instruments, digital records, surveys and the World Wide Web hitting every research discipline. We are witnessing the birth of the fourth paradigm of science where large-scale data analysis now stands as a peer of theory, experiment and simulation. Advanced visualization used to be the only way to spot the trends in massive amounts of scientific data. Now machine learning can recognize patterns in data collection that are far too large or complex to visualize. Advanced data analytics and machine learning are now being used widely in the commercial sector to understand how the economy works, how to make our online searches more relevant and how to help hospitals deal with high re-admission rates.
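As a toy illustration of the pattern recognition described above, here is a minimal k-means clustering sketch in pure Python. K-means is a standard machine learning algorithm, chosen here only as an example (it is not claimed to be what any of the projects mentioned actually used), and the tiny 1-D dataset stands in for collections far too large to visualize:

```python
import random

def kmeans_1d(points, k, iterations=20, seed=0):
    """Cluster 1-D data into k groups by iteratively moving centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iterations):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Move each centroid to the mean of the points assigned to it.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two obvious groups: values near 1 and values near 10.
data = [0.9, 1.0, 1.1, 9.8, 10.0, 10.2]
print(kmeans_1d(data, k=2))  # two centroids, one near 1.0 and one near 10.0
```

On cloud infrastructure the same idea scales out: each iteration's assignment step can be split across many virtual machines, which is the kind of burst capacity researchers rent from the cloud.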
Science is brimming with opportunities for us to use these techniques on our exploding data collections. For example, techniques such as those developed by David Heckerman and his Microsoft Research team are being used to understand the genetic causes of diseases. They used 27,000 computing cores on Windows Azure for a Genome Wide Association study (GWAS) of a large population. While this is a remarkable computational achievement, one exciting outcome is that the results of the analysis are now freely available as a cloud service (Epistasis GWAS for 7 common diseases) on the Windows Azure Marketplace.      
Many public scientific data collections are growing so fast that there is no way for an individual researcher to download them. And, as their personal data collections grow, researchers are putting pressure on overstressed university resources to house and maintain them. Funding agencies around the world are starting to insist that data from publicly funded research should be made available to the public, but it is not clear how we will financially sustain all these data collections.    
Fortunately, high-quality data is very valuable, provided it is easy to access and analyze. Given a highly valuable research collection and user friendly cloud analysis tools such as the Epistasis GWAS service, the research community may be able to support it through modest subscriptions. A great example of this model is the Michigan Inter-university Consortium for Political and Social Research (ICPSR), a high quality data collection run by an expert curation team that has existed for 50 years and is funded by a combination of grants and dues paid by hundreds of member institutions.   
 
Working with Internet2, we have already begun discussions with research community members about how we can build cloud services for science. Our long-term goal is to create self-sustaining scientific research ecosystems of data and services based on community data and community supported tools. Science is driven by a combination of trusted open source tools and high-quality commercial software. We can make these widely available in the cloud as services. In addition to supporting the data and computation, we believe expert users will also have a platform to create and market indispensable research services. In the cloud world, we have “Infrastructure as a Service”, “Platform as a Service” and “Software as a Service.”  Why not “Research as a Service”?

Cultivating a Cloud Classroom: How To Use Google Docs Offline


To enable Google Docs for offline use, sign into your account and click the sprocket icon in the upper-right corner. Then select "set up docs offline." Google Docs will then launch a dialogue box asking you to confirm that you want to enable docs offline. 




If you already have Google Drive installed, you're finished with the setup. If you don't have Google Drive installed, you will be prompted to install it. If you need help setting up Google Drive on your Mac or on your PC, please see the directions that I have included in my guide to Google Drive and Docs for Teachers. Source: Free Technology for Teachers


Technology investors betting big on cloud computing startups in hopes of strong returns


Technology investors are raising the tempo of investments in cloud computing startups buoyed by strong returns and growing customer demand for software as a service.





This week, venture funds closed two more deals in the sector with Norwest Venture Partners putting in $6 million (about Rs 32.6 crore) in first-round funding for Attune Technologies. The Chennai-based startup uses cloud technology for scheduling, billing and management of patient data with a base of 2 million patient records. Source Economic Times

Cloud Computing Pioneers: Frank Frankovsky


Frank Frankovsky worked as Dell's director of Data Center Solutions during the crucial period of 2006-2009, building up the hardware maker's ability to sell rack-mount servers to search engine and Web service companies seeking to build new, more efficient data centers. The unit has been a key, behind-the-scenes business that has kept Dell a leading player in server hardware. If Data Center Solutions had been broken out as a separate business, it would have been the number-three seller of servers in the U.S. in early 2010, Dell executives told InformationWeek during a visit to the Dell campus. In October 2009, Frankovsky became director of hardware design and supply chain at Facebook during a crucial period in its expansion.



While there, he advocated that cloud server design be based on publicly pooled intelligence, despite Google's insistence that its server and data center designs were a competitive advantage. In April 2011, Mark Zuckerberg and other Facebook officials announced the launch of the Open Compute project to set standards for efficient cloud servers. "The benefits of sharing so far outweigh the benefits of keeping it all closed," Frankovsky told Venture Beat in July 2012. As an organizer of the OpenCompute.org project, Frankovsky helped pull in innovative and potentially competing projects behind the Open Compute standard. Financial services companies had watched the Google example and sought cloud computing servers of their own. Intel and AMD had been asked by their Wall Street customers to produce their version of a cloud server, examples that were donated to the new organization. "What began a few short months ago as an audacious idea -- what if hardware were open? -- is now a fully formed industry initiative, with a clear vision, a strong base to build from and significant momentum," Frankovsky wrote in an Oct. 27, 2011 blog. Source: Information Week

Cloud Computing Pioneers: Urs Holzle


Urs Holzle

The phrase, "the data center as the computer," comes so close to capturing what a cloud data center is about that a tip of the hat has to go to Urs Holzle. The senior VP for technical infrastructure at Google led the design and build-out of the search engine's supporting infrastructure and supplied a pattern for Amazon, Microsoft, GoGrid and others to follow.

As one of Google's first 10 employees, Holzle refused to be caught in the limits of what was then available from technology providers. Servers hadn't been designed for the cloud data center, so Google manufactured its own, according to the tenets that Holzle laid down. A Google data center is designed to use about half the power of a conventional enterprise data center.
In 2009, Holzle and fellow Google architect Luiz Andre Barroso captured in a Google whitepaper the concepts essential to building a worldwide string of search engine data centers. It was called "The Datacenter as a Computer: An Introduction to the Design of Warehouse-Scale Machines."
Holzle is a former associate professor of computer science at the University of California, Santa Barbara. He received a PhD from Stanford for work on the efficient implementation of programming languages. He is co-sponsor, with VMware CEO Pat Gelsinger, of the Climate Savers Computing Initiative, and he co-authored a second paper with Barroso, "The Case for Energy-Proportional Computing," which outlines ways for servers to use only the energy required to execute the current workload. The paper is credited with pushing Intel and other manufacturers to find ways to adjust the current consumed by their chips. Source: Information Week

The Hyperconnected World: A New Era of Opportunity


The hyperconnected world is today's reality. No longer are we in a world where consumers and employees "go online" to work, play, or purchase; we are now in a world where everyone and everything simply is online, whether at home, at school, at the office, or on the go. This new era brings with it an acceleration of innovation and disruption. It's a world filled with opportunity for those willing to embrace it and able to tame it. All around us, across every industry, companies are discovering new audiences, creating new revenue streams, building new ecosystems, and inventing new business models – all online, all at an unprecedented pace. The Internet has evolved from being a "nice-to-have" – an additional channel for growth – to becoming the channel for growth and innovation.


Cloud Computing Pioneers: Marc Benioff

Marc Benioff

Marc Benioff, CEO of Salesforce.com, stands out as the pioneer and guerrilla marketer of software-as-a-service. He drew attention to the concept at a time when it was widely disregarded as an aberration of limited use by brazenly advancing the idea of cloud services as the "death of software." He meant that on-premises software, the systems that have been making enterprise data centers run since 1964, was going away, replaced by software running in remote data centers accessible over the Internet.


Much has already been written about the successful establishment of Salesforce.com, which doesn't need repetition here. But for his role in winning respect for the concept of SaaS, no one matches the standing of Benioff. Source: Information Week

Cloud Computing Pioneers: Chris Kemp


Chris Kemp

In the early days of cloud computing, NASA CTO Chris Kemp took several leading concepts of how to assemble a low cost, horizontally scalable data center and put them to work at the NASA Ames Research Center in Mountain View, Calif.


One concept was placing banks of standard x86 server racks in a shipping container with a single power supply and network hookup. The container was dropped off by supplier Verari, and was hooked up and ready to accept workloads within a few days, compared with the long time it takes to construct a new, permanent data center. He also ensured a close tie-in to MAE-West, a major Internet access point that NASA already had at Ames.
Kemp initially created the Nebula cloud project to collect big data from NASA research projects, such as the Mars mapping project. But Kemp also conceived of a mobile cloud data center that could be transported to different locations to provide onsite compute power, no matter where a spacecraft was launched or an interplanetary mission was managed.
Kemp also advocated sharing NASA data, and both Google and Microsoft have used telescopic images and mapping from the Mars Reconnaissance Orbiter to create public image libraries online. He also initiated the OpenStack open source code project when NASA sought to team up with Rackspace to combine cloud computing software assets.
In March 2011, Chris Kemp resigned his post with NASA, an agency with which he had dreamed of working since he was a child, to become founder and CEO of Nebula. He was leaving, he said, "to find a garage in Palo Alto to do the work I love," a turn of phrase that showed he would be equally at home walking the halls of Congress or working the venture capital hallways of Menlo Park, Calif.
Not an imposing figure in stature, he is nevertheless an indomitable one. In a debate among Eucalyptus Systems, Citrix CloudStack and OpenStack at GigaOm's Structure 2012, Kemp, speaking for OpenStack, was hemmed in by CloudStack's Sameer Dholakia and Eucalyptus' Marten Mickos, who seemed to have jointly aimed their sharpest comments at OpenStack. In answer, Kemp declared that he would be on the stage the following year without either of them as OpenStack grew larger. It was a brash, if not rash, comment, but one that nevertheless brought a moment of breathing room in which to talk about OpenStack capabilities and momentum. Source: Information Week

Cloud Computing Pioneer: Rich Wolski

Rich Wolski

Rich Wolski is the co-founder and CTO of Eucalyptus Systems who decided that Amazon's public cloud APIs were so important that they should have open source code counterparts -- even if Amazon Web Services was against it.


He has been criticized on several fronts. One, his approach to cloud computing was too narrow -- it was based only on Amazon's example and initiative. Another: if Amazon wished to make its APIs open source, it could do so; if it didn't, it could make life difficult for an open source project that was doing so.
Wolski ignored the critics and pushed ahead both with his open source code leadership and Eucalyptus Systems, which makes a stack of software for building private clouds with Amazon EC2 compatibility. Amazon executives, for years unresponsive to Eucalyptus' entreaties to join the open source project, announced in late May that Amazon would partner with Eucalyptus Systems as a provider of private cloud APIs. It was, finally, a blessing on Wolski's initiative.
Also a computer science professor, Wolski is a person of strong convictions who believes the world will convert to a new style of computing -- and that Eucalyptus is destined to play a role in the conversion. Source: Information Week

Cloud Computing Pioneers: Lew Tucker



Lew Tucker

Lew Tucker already had 20 years of software development and engineering under his belt when the cloud era rolled around. He was quick to recognize that his previous projects were pointing in the cloud's direction.


He had been CTO and VP of engineering at Radar Networks, producer of the Twine social network, and VP of the AppExchange at Salesforce.com. His big-company experience brought a different voice to the debate over cloud, one of experienced and toughened engineering that said cloud not only could be, but also should be, the next wave of computing.
Tucker was CTO of cloud computing at Sun Microsystems in 2008-2010, a crucial period when Oracle acquired Sun, and where his depth of knowledge countered Oracle's fatuous putdowns of cloud computing. After the acquisition, Oracle CEO Larry Ellison interviewed Tucker; Tucker said it took only three minutes before both men had made up their minds. In that short time, Oracle lost one of the few spokesmen capable of rolling back the skepticism that Oracle would ever be serious about cloud computing, something that it's still reaching for as it reverses course and wades more deeply into the field.
Tucker is now CTO of cloud computing at Cisco Systems, a tireless advocate (and board member) for OpenStack, and an ignorer of boundaries -- as long as the other party can talk about cloud computing. At the recent Cloud Expo, he ducked into a meeting room to pay his regards to Rich Wolski, head of the Eucalyptus open source project at the University of California at Santa Barbara. Eucalyptus might be painted as an OpenStack competitor, but in Tucker's eyes Wolski is simply another passionate cloud enthusiast. He does the same on the OpenStack board of directors, where he's part of the social cohesion that holds competing members together. Source: Information Week

Cloud Computing Pioneers: Jonathan Bryce


Jonathan Bryce liked working with computers as a youth and had an older brother who was one of Rackspace's first 12 employees. He urged Jonathan to work at Rackspace, and Bryce became familiar with many phases of the operation, from racking servers to customer service and technical support. He partnered with website designer and friend Todd Morey to host sites on their own rented servers in Rackspace. They left Rackspace in 2005 to branch out into their own website building and hosting business, Mosso Cloud, named for an Italian musical notation phrase that means "to play faster and with more passion."


But Mosso still ran on servers in the Rackspace data center. Rackspace executives saw the relationship between its hosting-services business and emerging uses of cloud computing, so they asked Bryce to keep building out the Mosso Cloud. He had a system that could launch applications on a website and was thinking about a virtual machine launching system. Then Rackspace bought Slicehost, which already had such a system. Its virtual machine management became part of Mosso, and Bryce rejoined the company as the head of Rackspace Cloud.
Rackspace attempted to expand its cloud computing business by distinguishing itself from the market leader, Amazon Web Services. It offered smaller, get-started virtual servers at $0.015 an hour. And it opened up its cloud API, prompting NASA to propose that the two combine their cloud efforts in a joint project, OpenStack. Rackspace saw OpenStack both as a means of spreading a common cloud computing base among private companies that could interoperate with Rackspace, and as a means of changing the terms of competition with Amazon.
Rackspace led OpenStack as a sponsor, but realized the project would have greater appeal with broader sponsorship. It turned over management to the newly formed OpenStack Foundation in September 2012. Cisco's CTO of cloud computing, Lew Tucker, and Red Hat's Brian Stevens, both members of the foundation's board, said Bryce was their top candidate to become its executive director, a post he accepted. At age 31, he is an innovative spirit with implementation experience who asserted himself when it still wasn't clear which direction cloud computing would take. Source: Information Week

Researchers find new way to perform computing tasks with cloud browsing


A cloud-based browser could allow mobile devices such as smartphones with limited computing power to perform large-scale computing tasks

Researchers from North Carolina State University and the University of Oregon have found a new way of performing large-scale computing tasks through cloud-based browsers.

Figure: Cloud Browser.
The researchers claimed that the cloud-based browser could allow mobile devices such as smartphones with limited computing power to perform large-scale computing tasks.
A cloud browser creates a web interface in the cloud through which computing tasks can be performed remotely instead of on the device itself. Complete story here
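The idea described above, doing the heavy computation on a server and returning only the result to a weak client, can be sketched in a few lines. This is a minimal illustration, not the researchers' actual system: a local HTTP server stands in for the "cloud", and the hypothetical task (summing squares) stands in for a real workload.

```python
# Sketch of compute offloading, the idea behind a cloud browser: the client
# sends a task description to a server, which does the work and returns the
# result. A local HTTP server stands in for the remote "cloud" here.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class OffloadHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # The server does the heavy lifting (here: summing squares up to n).
        length = int(self.headers["Content-Length"])
        n = json.loads(self.rfile.read(length))["n"]
        body = json.dumps({"result": sum(i * i for i in range(n))}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example's output quiet

def offload(n: int, port: int) -> int:
    """Ship the task to the 'cloud' and return only the computed result."""
    req = Request(
        f"http://127.0.0.1:{port}/",
        data=json.dumps({"n": n}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:
        return json.loads(resp.read())["result"]

# Start the stand-in "cloud" on an ephemeral port, offload one task, shut down.
server = HTTPServer(("127.0.0.1", 0), OffloadHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
result = offload(1000, server.server_address[1])
print(result)  # the device only sent {"n": 1000} and received this number
server.shutdown()
```

The point of the design is that the client ships a tiny task description and receives a tiny result; all CPU and memory cost stays on the server, which is what makes the approach attractive for constrained mobile devices.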

Cloud Computing Pioneers: Randy Bias



Randy Bias, cofounder and CTO of CloudScaling, has been a specialist in IT infrastructure since 1990, which positioned him to think through and lead some of the key cloud computing innovations. He was a pioneering implementer of infrastructure-as-a-service as VP of technology strategy at GoGrid, a division of hosting provider ServePath. GoGrid launched a public beta of its Grid infrastructure in March 2008.

He pioneered one of the first multi-platform, multi-cloud management systems at CloudScale Networks and went on to found CloudScaling, where he successfully implemented large-scale clouds based on a young and unproven open source software stack, OpenStack. Those large-scale clouds included KT (formerly Korea Telecom), the largest cloud service in Korea, and the big data center services provider Internap.
Part of the support OpenStack receives is based on these implementations, and Bias was elected as one of eight gold-sponsor board members of the OpenStack Foundation. He casts an unvarnished eye on cloud claims and cloud pretensions, and is known for his uncompromising point of view. In 2009, he advocated the efficiencies of cloud computing as a way to counter climate change.
The O'Reilly Radar blog says Bias "led the open licensing of GoGrid's API, which inspired Sun Microsystems, Rackspace Cloud, VMware and others to open license their cloud APIs." Source: Information Week
