miércoles, 19 de abril de 2017

Automotive operating systems

Via CNN Money and C|Net I come across the same news item from the Daily Yomiuri detailing the formation of a consortium of Japanese companies and institutions (Toyota, Nissan, Honda, Toshiba, Denso and the Ministry of Economy, Trade and Industry, among others) to develop an operating system for automotive use. The initiative is currently called JasPar, and the goal is to have a functional prototype by 2009.

Digging a little further into the news, I find that 70% of the market for automotive operating systems is in the hands of OSEK/VDX, a German-French consortium formed by merging the work of OSEK ("Offene Systeme und deren Schnittstellen für die Elektronik im Kraftfahrzeug", originally composed of BMW, Bosch, DaimlerChrysler, Opel, Siemens, VW and the University of Karlsruhe as coordinator) and VDX ("Vehicle Distributed eXecutive"), which came from the French automotive industry, specifically PSA and Renault.

It is estimated that electronic components and software already account for around 20% of the cost of an average car, a percentage that reaches 50% in hybrid cars. In the eighties, the number of electronic systems installed in the average car remained stable at around five, while at present it is about thirty on average, with some luxury models exceeding one hundred. Such demand is attracting the interest not only of the aforementioned industry consortiums but also of technology companies such as IBM or Microsoft.

The evolution of this type of operating system could have enormous consequences for the establishment of communication standards that would largely avoid the escalation of such costs through joint research, as well as a strong impact on the aftermarket service market. At present, brands jealously protect their electronics for competitive and security reasons, which creates enormous difficulties for workshops and for the development of compatible parts and products, but in certain parts of the industry voices are beginning to advocate moving towards open developments.

What is this "Windows Cloud"?

A quick mention by Steve Ballmer last Wednesday is generating a lot of intrigue and interest: a pre-announcement of something Microsoft's CEO called Windows Cloud, which would be designed for programmers writing cloud computing applications, and which would be unveiled by the company within one month. Details at ComputerWorld or Slashdot.
The most common reaction is skepticism. Can Microsoft, the company that proved unable to react to the arrival of ultraportables with anything more than resurrecting an eight-year-old Windows XP, suddenly put on the market a light, cloud-oriented operating system? That a product whose launch is expected within a month does not yet have a definitive name, and that nothing was known until now about such an important change of strategy, does not seem like Microsoft's usual style. According to Ballmer,

"We need a new operating system designed for the cloud and we will introduce one in about four weeks, we'll even have a name to give you by then. But let's just call it for the purposes of today's "Windows Cloud." (...) We're not driving an agenda towards being service providers but we've gotta build a service that's Windows in the cloud "
The boom in the ultraportable market, already dubbed a "netbook revolution" and with expected sales of two hundred million units by 2013 (the same size as the entire market predicted for conventional laptops), could be pushing Microsoft to change its habitually slow pace of development for something more reactive, able to counteract a Google whose latest move, the launch of Chrome, clearly confirms what we discussed at the time: a threat designed to compete with Windows (incidentally, thanks to all those who in those days came to "kindly explain" the concept of an operating system to me in the comments, as if I, who had been teaching Information Systems for eighteen years, did not know it... this must be the first time I am glad to agree with none other than Steve Ballmer :-)
"If you talk to Google they'll say it's thin client computing but then they'll issue a new browser that's basically the big fat operating system designed to compete with Windows, but running on top of it"
Without a doubt, a powerful reason to try to energize Microsoft's product portfolio by putting on the table something more than the old XP which, beyond being an environment familiar to many, is clearly a system not designed, in orientation or philosophy, for the characteristics of an ultraportable. Will Microsoft be able to give a response as quick and interesting as Ballmer seems to imply, and pull out of its sleeve a light, web-oriented operating system, or will we simply be talking about something as cloud-like as vaporware?

martes, 18 de abril de 2017

Watson, between artificial intelligence and Skynet

When, on May 11, 1997, Deep Blue managed to defeat Garry Kasparov at chess, many began to look at computers differently. The feeling of unease was mitigated by some nuances: the victory had been tight (two wins, one defeat and three draws), IBM did not accept the rematch requested by the Russian champion and decided to dismantle the machine, and the fact that the machine's excellence was confined to an area of human intelligence as narrow as chess led many to consider that artificial intelligence still had some way to go.

In 2011, a new IBM development, Watson, defeated in Jeopardy! the two best players of all time in that contest, and many began to entertain the idea that a machine could be better than humans in a game involving not only knowledge, but also intelligence when interpreting questions and answers in a given context. Watson's victory this time not only admitted no nuances, but took place under limitations as severe as being disconnected from the internet, operating only with the information in its memory.

Last January, IBM installed at Rensselaer Polytechnic Institute, a private university in New York state, a Watson-like machine with access to a huge 15TB database, with the idea of equipping the machine with new skills and preparing it for deployment in new industries. At the moment, it is being tried out in areas such as finance, information technology, business analytics or medicine, in which access to and instant analysis of a vast amount of documentation that would be completely impossible for any human to process allows the resolution of complex scenarios subject to a multitude of constraints. To get an idea, you can watch this IBM video on a hypothetical application of Watson to the treatment of a cancer patient:

Indeed, a very significant part of what we call intelligence is linked to the ability to locate relevant information in a very broad knowledge base: what Watson is able to do with its massive processing and storage capacity is simply to expand that ability almost infinitely, like a doctor who had read, understood and memorized everything published on a particular subject and could also recall it almost instantly and with complete precision. Moreover, the machine can even use its own analyses and previous results as inputs, leading to a concept very similar to human learning as such.
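The idea of "locating relevant information in a very broad knowledge base" can be illustrated with a toy example. This is a minimal sketch (not IBM's actual technology, which is vastly more sophisticated) of TF-IDF ranking, one of the classic techniques for finding the document most relevant to a question:

```python
# Toy document retrieval via TF-IDF scoring. This is an illustrative
# sketch only; Watson's actual pipeline is far more complex.
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def build_index(docs):
    """Precompute per-document term frequencies and global document frequencies."""
    tfs = [Counter(tokenize(d)) for d in docs]
    df = Counter()
    for tf in tfs:
        df.update(tf.keys())
    return tfs, df

def rank(query, docs):
    """Return document indices sorted by a basic TF-IDF relevance score."""
    tfs, df = build_index(docs)
    n = len(docs)
    scores = []
    for i, tf in enumerate(tfs):
        score = sum(
            tf[term] * math.log(n / df[term])  # frequent-in-doc, rare-overall wins
            for term in tokenize(query) if term in tf
        )
        scores.append((score, i))
    return [i for _, i in sorted(scores, reverse=True)]

docs = [
    "chess engine search evaluation",
    "cancer treatment clinical trial evidence",
    "cloud data center energy consumption",
]
print(rank("cancer clinical treatment", docs)[0])  # → 1
```

The point of the sketch is the scale argument in the paragraph above: the same scoring loop that ranks three documents here ranks millions in a real system, which is where the machine's advantage over human recall comes from.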

The frontier now moves to a new type of analysis, as "the machine goes to school": that of data that includes, for example, information and context generated by one or more social networks in real time. What happens when the ability to feed information into Watson's memory becomes virtually unlimited? What happens when we go from feeding Watson in a controlled manner with academic information to doing so with anything it can find and classify for itself on the internet: not simply real-time data from a particular source or controlled sources, but all kinds of information?

The evolution of the technology industry towards the cloud

Technological take on the cloud - Cinco Días

Marimar Jiménez writes a piece on technology industry trends in Cinco Días, "Technological take on the cloud" (pdf), which reflects some of my comments on Dell's recent acquisition of EMC2, and for which she sent me some questions via email.

What we are experiencing is precisely this phenomenon: the development of the cloud by companies initially far removed from so-called "corporate computing", such as Amazon or Google, coupled with the subsequent but decisive leap taken by Microsoft in the same direction, is leaving many of the companies that used to sell hardware to that market completely out of the game, waiting for the renewal of multi-million contracts that, in most cases, will never arrive. Yes, the cloud is obviously based on hardware, but that hardware is no longer supplied by the HPs, EMC2s and Dells of the world: it is built from much cheaper components that Amazon, Google or Microsoft themselves develop in-house, with the critical factors being, fundamentally, minimizing cost and maximizing flexibility.

The result is a sharp shift of value from the "iron" market to that of services, which many are still trying to understand, and which leaves more than one player out.

Here are the questions and answers I exchanged with Marimar:
Q. What are the major trends that are setting the pace of the IT sector today?

A. Everything indicates that the IT sector is marked by the transition towards variable costs, outsourcing, lightness and the reduction of investment in fixed assets. Large corporate contracts with hardware companies are becoming history, and customers now choose to rent capacity instead of buying machines, relying on data centers maintained by specialists that allow them to pay only for the processing capacity, storage and bandwidth required at any given time, that offer much higher security (being in the hands of specialists) and much lower costs.

Q. It seems clear that "iron" is losing weight in favor of services and software, but is this phenomenon the same in the consumer segment as in the business segment?

A. Hardware is an already practically commoditized business, with the great majority of its processes reduced to cost efficiency, with generally very thin margins and little money to be made. Competing in this segment is exhausting, and only a few brands do it well. In the business segment, the large servers of years past are moving to the cloud, and the supply contracts that went with them are disappearing.

Q. Which companies are best understanding the evolution of the sector towards the intangible, and why?

A. Without a doubt, Amazon, which developed a product from its own needs and subsequently turned it into a market leader and a continuous learning experience. The growth of Amazon Web Services is impressive, a true case study, and says a lot about the flexibility of a company that started out selling books. Microsoft, on the other hand, has carried out a very important refocusing process, has made the cloud one of its top priorities, and is becoming very competitive in that area.

Q. And who are the big losers?

A. The big losers are the protagonists of what was called "corporate computing": the suppliers of machines, components and software that now find themselves displaced by third parties that rent those resources out. Large cloud competitors develop their own technology, scale better because of their focus, and are often not customers of corporate suppliers because they tend to create their own infrastructures to gain control over every aspect that can be optimized.

Q. How do the concepts that seem to be marking the roadmap of all companies in the sector, whether hardware, services or software, such as the internet of things, 3D printing or robotics, fit into this evolution?

A. The internet of things generally requires simple and inexpensive components, and connectivity alternatives that make sense. A chip tucked into a bicycle so that you can eventually locate it if it is stolen cannot connect in the same way as a smartphone, because the needs are completely different. This points to a focus on very light, almost omnipresent services, with narrow margins, boosting the creativity of entrepreneurs.

Q. In the end, with this shift of the sector towards the intangible, will the companies that do everything, such as Google, Facebook, Amazon or Apple, prevail, or the pure specialists? Or both?

A. The ecosystem is shaped by large companies operating in global environments, surrounded by an entrepreneurial fabric that develops alternatives based on close proximity to the customer, creativity, needs detection, etc. I do not think that will change especially; it will just become more efficient.

Q. And in this context, what about the challenge of the European digital single market? It seems to make progress and then suddenly not, with the latest ruling on data transfers, for example.

A. The European digital single market is a huge necessity, because its lack of development deprives the continent's companies of a huge advantage: a nearby field in which to achieve scale and competitive advantages.

miércoles, 12 de abril de 2017

Head in the cloud

It could not be clearer: the cloud is the next big topic of discussion. Google's strong position in this area for several years, with its approaches and products, is now joined by a serious offensive by Microsoft disclosed in a Business Week article, "Microsoft to Google: Get Off My Cloud", in which the company announces an ambitious plan to build over the next few years a network of twenty state-of-the-art data centers with a projected cost of more than one billion dollars each, designed to "reinvent the infrastructure of our industry" and counter the "good work done by Google to create hype around the subject". In an industry where its two main sources of competitive advantage, licensing fees and the push of the PC industry as a distribution channel, seem to be fading into the background, the idea of bringing products like Office or Exchange to the cloud under a leasing model seems to make more and more sense.
But Microsoft is not the only major player putting its head in the cloud: IBM announces a "Resilient Cloud Validation" program, a kind of "seal of approval" designed to build trust among companies considering their transition to the cloud, as well as to offer services both to companies that decide to build their own cloud and to those that prefer to rely on third-party infrastructure: build in or rent out, the everlasting question, with legions of IBM consultants willing to contribute doctrine.
Between the two pieces of news, there is not the slightest doubt that the whole industry is preparing to put its head in the cloud. A trend announced for a long time, which takes advantage of the evidence of the crisis through promises of lower management and infrastructure costs. Clouds are made of vapor, but this one seems to be gaining more and more substance.

Chromebook: New Era?

Google yesterday rekindled the informational interest of the technology world at its Google I/O with the announcement of the launch of its Chromebook, which for many represents the culmination of the company's strategy in an all-in-one solution of hardware, software and services.

Seeing the launch of the Chromebook as just another product, as one more computer, is a very unambitious approach. In fact, it is a complete redefinition, the first serious step in the idea of transforming computing as we know it from a computer-centric to a network-centric scheme, with all that this can entail. Obviously, the fundamental step is missing, the great doubt today: knowing whether the market is prepared for something like this, whether it will receive such devices with enthusiasm or, conversely, whether this will be one of many flops in Google's history as a company. On the horizon, the presentation of three types of offers: for the consumer market, for companies and for students, with different schemes and prices. In the first case, a payment for the machine of $429 in the basic version without 3G and $499 with 3G (and 100MB of data traffic with the operator of the day, consumed in a sigh). In the second, a monthly payment of $28 for companies or $20 for students that includes maintenance, system updates and a warranty.


The idea is to present the cloud not as an option, but as an advantage. Something as complex as getting a series of things that a very significant part of the market perceives today as limitations to be seen as something positive, as a guarantee of security, portability or versatility. The change of perception is not simple, neither for end users who have kept almost everything in the cloud for years, nor for a corporate market that is still asking itself whether Google is the partner to which it wants to entrust the entire operation of its business. In that sense, the Chromebook is a very risky product: the presentation at Google I/O was made before an enthusiastic and apparently convinced audience, but a serious purchase decision is another matter. In fact, in my tests with the product last February, I quickly went from valuing positive aspects such as versatility and flexibility ("I just sit in front of it and it feels like my computer") to limitations ("where are my files, where is the administrator (it seems they have improved it), or what happens if the connection drops"). No, it is not a simple transition, nor is it going to be a two-day affair. After decades of getting the market used to "the more powerful the better", the Chromebook is such a brutal change of model that my impression is that Google will find more resistance than expected in making it a success. It is one of those things that you can analyze rationally, whose advantages you can appreciate or even turn into numbers with some solvency... but that, as a decision, costs more to take than it seems.

Is Google risking a lot with the Chromebook? My impression is yes. And I do not mean economically. Of course, a great success and a legion of customers paying for cloud services would mean a very positive revenue diversification for a company that is virtually single-product today. But where Google is really putting itself on the line is in its prestige as a company capable of setting trends. For the Chromebook to be considered a success, a sales success is not enough. It needs a revolution similar to the one Apple managed with the launch of its iPod or iPhone. Something that really becomes a model for others, that is imitated, that is talked about as a before and after. It needs to go from what today are two models (only one for me, since Acer as a brand lost all credibility with the astonishing quality of its Aspire One) to an entire ecosystem of competitors launching not only machines, but also offers that compete with the Google cloud on the same level. If not, if within six months Google is practically alone in this market, we will not be talking about anything other than another flash in the pan, another product ahead of its time and its market. These kinds of successes, these radical model changes, are decidedly not easy to achieve.

At stake are the real standing of the cloud and the validation of its model in the eyes of the end user. It is not a bet we see every day.

Hyperconnected world

Yesterday, a lightning strike outside Dublin caused an incident in the transformers that supply electricity to the data centers that Amazon and Microsoft have in the Irish capital, and with which they serve, in the case of Amazon, its EC2 cloud computing platform and, in that of Microsoft, its BPOS (Business Productivity Online Services). The result was the outage or degradation of service of numerous websites across Europe that use these services, a case that is not the first and obviously will not be the last.

Meanwhile, a Microsoft Research project proposes something as interesting as relocating and distributing data centers in private homes in the form of heating appliances: server-heaters that you would have at home, while the company subsidizes the electricity needed to keep them operational and, collaterally, to heat your home. Relatedly, a study reveals that Google runs about nine hundred thousand servers, and that its energy impact is considerably smaller than expected, due both to the economic crisis and to the intensive use of technologies such as green IT and virtualization.

The last stupid idea: breaking the cloud to canon blow

A news item in CIO comments on what appears to be the latest idea from the insatiable collecting societies, now being studied by the European Commission: to levy a fee on the use of cloud services to supposedly "compensate" creators for the circulation and use of their works through this medium.

According to the twelve copyright collecting societies that contacted the Commission on Wednesday, the private copying levy is a fundamental system for the remuneration of artists, has proved not to be a disincentive to the purchase of devices in the countries where it has been put in place, and must be reconverted to meet its new challenge: operating in an environment where physical copies are no longer necessary and both circulation and consumption take place through platforms in the cloud. The approach, apart from being the umpteenth exponent of the voracity to collect, and of the pretense that for some mysterious reason we have to subsidize certain businesses because they complain about the progress of technology, would be extremely problematic in the case of the free tiers of freemium services like Dropbox, GDrive, SkyDrive and the like.

The idea of paying a fee for using cloud platforms, just in case we upload copyrighted works, is as absurd as charging a fee for the air we breathe, just in case we get the idea of whistling a song. But of course, there is no lack of entities capable of proposing it, nor of politicians willing to listen. The war of the levy returns. Hold on tight, curves ahead.

martes, 11 de abril de 2017

Telefónica makes technology to analyze big data available to companies

Telefónica offers companies the possibility of developing big data projects without investing in infrastructure, and with the best advice, through a Hortonworks Data Platform for Hadoop. Hewlett Packard Enterprise (HPE) and Telefónica, a platinum partner of HPE, unite their digital transformation strategies for large companies in order to provide them with the tools needed to address their digitalization. A goal for which Telefónica has created Living Cloud, a global proposal that covers all the essential technological and methodological elements in this process, in which big data is an essential pillar.

Telefónica puts at the disposal of large companies the services of its Hortonworks big data platform for Hadoop, supported by Hewlett Packard Enterprise (HPE) technology, so that companies can make the most of their data without investing in infrastructure or advice, which enables them to improve their resources and business processes, speed up decision-making or undertake new business challenges.

In an increasingly competitive market context, data is an asset of great relevance to companies. Its predictive analysis helps reduce production costs by optimizing critical parts of their processes, improving their customers' experience and even embarking on new business challenges, thus improving their market position. A key step in addressing the already familiar and necessary digital transformation of companies.

Big data: The key is in the analysis
The Hortonworks Data Platform for Hadoop offered with Telefónica has two functions: on the one hand, it allows organizations to collect and store all kinds of information in a single repository; on the other, it runs processes of data transformation, and predictive and prescriptive analytics, on that same information; all of which is essential for the use of big data in large companies.
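Those two functions, collect-and-store plus predictive analytics, can be sketched in miniature. The example below is a toy illustration under stated assumptions (an in-memory SQLite table standing in for the data repository, and a least-squares trend line standing in for the platform's predictive analytics), not Telefónica's actual service:

```python
# Toy sketch of the two platform roles described above:
# (1) collect and store records in one repository,
# (2) run a simple predictive analysis over them.
# SQLite and least squares are stand-ins, not the real platform.
import sqlite3

def ingest(conn, rows):
    """Store (month, units_sold) records in a single repository."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (month INTEGER, units INTEGER)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

def predict_next_month(conn):
    """Fit a least-squares linear trend to the stored data and extrapolate."""
    data = conn.execute("SELECT month, units FROM sales ORDER BY month").fetchall()
    n = len(data)
    sx = sum(m for m, _ in data); sy = sum(u for _, u in data)
    sxx = sum(m * m for m, _ in data); sxy = sum(m * u for m, u in data)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    next_month = max(m for m, _ in data) + 1
    return slope * next_month + intercept

conn = sqlite3.connect(":memory:")
ingest(conn, [(1, 100), (2, 110), (3, 120), (4, 130)])
print(round(predict_next_month(conn)))  # → 140
```

The real platform does this at petabyte scale across a Hadoop cluster, but the shape of the workflow, ingest into a shared store and then analyze in place, is the same.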

The big data service that Telefónica offers companies is based on an independent platform that guarantees high performance and security, since it is built on a reference architecture created from Telefónica's knowledge and experience, and it can scale up and down while maintaining performance.

In addition, Telefónica offers companies technology consulting to keep their data management and extraction processes always up to date, as well as advanced analytics services with the applications that best suit their needs.

Big Data: a little introduction

I have been collecting information about big data for some time, and introducing some notions about the subject in some of my courses, but today, while preparing a conference, I realized that it was a topic we had not yet discussed on this page, despite being one of the most current trends in the industry.

By big data we mean exactly what the name implies: the treatment and analysis of huge data repositories, so disproportionately large that it is impossible to handle them with conventional database and analytics tools. The trend arises in an environment that does not sound strange at all: the proliferation of web pages, image and video applications, social networks, mobile devices, apps, sensors, the internet of things, etc., capable of generating, according to IBM, more than 2.5 quintillion bytes per day, to the point that 90% of the world's data has been created during the last two years. We are talking about an absolutely relevant field for many purposes, from the analysis of natural phenomena such as climate or seismographic data, to areas such as health, security or, of course, the business environment. And it is precisely where companies do business that an interest is emerging that makes big data something like "the next buzzword", the word that we will surely hear coming from everywhere: technology vendors, tool makers, consultants, etc. At a time when most managers have never sat in front of a simple Google Analytics page, and are powerfully surprised when they see what it is capable of doing, along comes a panorama of tools designed to make sense of things immensely larger and more complex. Be afraid, be very afraid.

What exactly is behind the buzzword? Basically, the evidence that existing analysis tools fall short when it comes to converting the data generated into useful information for business management. If your company does not have a data analytics problem, it is simply because it is not where it should be, or does not know how to get information from its environment: as soon as we add to traditional operations and transactions issues such as an increasingly intense bi-directional interaction with clients, plus web analytics and the activity generated by social networks of all kinds, we find a scenario in which not rising to the challenge means a major disadvantage with respect to those who do. Operating in the environment with the greatest data-generation capacity in history simply entails adapting tools and processes: unstructured, unconventional databases that can reach petabytes, exabytes or zettabytes, and that require specific treatments for their storage, processing or visualization needs.

Big data was, for example, the star of the latest Oracle OpenWorld: the position adopted there is to offer huge machines with huge capabilities, massively parallel processing, unlimited visual analysis, heterogeneous data processing, etc. Developments such as Exadata and acquisitions such as Endeca support an offer based on thinking big, which some have not hesitated to question: in the face of this approach, the reality is that some of the most data-focused companies, such as Google, Yahoo! or Facebook, and almost all startups, do not use Oracle tools, opting instead for an approach based on distributed, cloud and open source technology. Open source like Hadoop, a hugely popular framework in this field that allows applications to work with huge data repositories and thousands of nodes, originally created by Doug Cutting (who gave it the name of his son's toy elephant) and inspired by Google tools like MapReduce and Google File System; or NoSQL, non-relational database systems needed to host and process the enormous complexity of data of all kinds being generated, which in many cases do not follow the ACID guarantees (atomicity, consistency, isolation, durability) characteristic of conventional databases.
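To make the MapReduce idea mentioned above concrete, here is a minimal single-process sketch of the programming model, using the classic word-count example. Hadoop's contribution is running these same map, shuffle and reduce phases distributed across thousands of nodes; the logic itself is this simple:

```python
# Minimal single-process sketch of the MapReduce model behind Hadoop:
# the classic word count. In Hadoop these phases run distributed
# across a cluster; here everything happens in one process.
from collections import defaultdict

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in the document."""
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs):
    """Shuffle: group all emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's list of values into a single result."""
    return {word: sum(counts) for word, counts in groups.items()}

documents = ["the cloud is the future", "the future is distributed"]
pairs = [pair for doc in documents for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["the"])  # → 3
```

The appeal of the model is that the mapper and reducer are pure functions over key-value pairs, so the framework can partition documents and keys across machines without the programmer writing any distribution logic.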

Looking ahead: a growing adoption landscape, and many, many questions. Implications for users and their privacy, and, for companies, for the reliability and real potential of the results obtained: as MIT Technology Review says, great responsibilities. For the moment, one thing is certain about big data: get your ears ready to hear the term.

Surprise! The cloud may fail!

If you did not read the news, maybe you noticed how multiple services from different companies failed at the same time: Foursquare, Reddit, Hootsuite, Quora and several hundred more companies went down or suffered problems with the outage of Amazon Elastic Compute Cloud (EC2), a cloud computing platform used by a large number of services.
The promise of system independence that Amazon offered as a guarantee of redundancy and stability did fail: several systems located in geographically separate locations failed at once due, it seems, to an uncontrolled backup procedure that made countless copies of itself, in a cascade effect that rapidly consumed all available space and gave rise to what has already been called "cloudgate" or the "cloudpocalypse". Something that, indeed, should never have happened, and which raises doubts of all kinds about the maturity and development of cloud computing as a whole.
Or does it? In fact, is this in any way different from the failure of a power plant? Or of a drinking water supply station? If we know anything about technology, it is that it is impossible for it not to fail, and that what we must do is take the appropriate measures so that when it fails (not "if it fails", because failure is something that reaches the category of metaphysical certainty), the effects of the failure are as small as possible. Electrical power fails in my house often enough that years ago I decided to purchase a modest uninterruptible power supply (UPS) for domestic use, and I know that such failures are perfectly common in many people's lives, not just in Spain but in other countries in which I have lived. When it fails, it is a major nuisance in your daily life, if not a small catastrophe due to problems of all kinds. And if you call the company, they apologize and basically tell you that, well, it was a failure and they cannot do anything, that things fail from time to time. And we are talking about services such as electricity or water, which have been with us for many, many years, in which we trust fully and on which we build many aspects of our lives, around a reliability that we take for granted.
Okay, the failure should not have occurred. As we have said on other occasions, the cloud is as good - or as bad - as your providers are. There is no "cloud"; there are companies that provide services in it. Companies in which to place certain levels of confidence, with risks to be estimated and valued, avoiding both one extreme (being systematically uncovered) and the other (investing more than the risk actually justifies). Both the defect and the excess pose problems, which can range from interruption of service and loss of reputation to excess cost. Technology, surprise, may fail. If the possibility of that failure is critical for your company, reduce it, preferably with different suppliers. A service like this blog that you are reading has several immediate-alert systems and several alternative procedures in case of an outage at my hosting provider, Acens, and even so, despite receiving attention protocols similar to those of Acens clients with a service criticality infinitely greater than mine, it also makes a daily backup on Amazon. And if all else fails... it makes practically no difference to me, because the service provided by this page is anything but critical. The possible impact of my blog being down for a full day is practically nil, because the next day my readers will surely still be there: much more depends, every day, on what happens inside my head and what comes out of my keyboard than on what can happen inside my server.
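The point about spreading critical services across different suppliers can be put in numbers. A back-of-the-envelope sketch in Python, assuming (optimistically) that provider failures are statistically independent; the availability figures are invented for illustration, not real SLAs:

```python
# Back-of-the-envelope sketch: combined availability when a service is
# replicated across providers whose failures are assumed independent.
# The 0.999 figures below are illustrative, not real SLAs.

def combined_availability(availabilities):
    """Probability that at least one provider is up at a given moment."""
    p_all_down = 1.0
    for a in availabilities:
        p_all_down *= (1.0 - a)
    return 1.0 - p_all_down

single = 0.999                                 # one provider: ~8.8 hours down per year
dual = combined_availability([0.999, 0.999])   # two independent providers

hours_per_year = 24 * 365
print(f"one provider:  {(1 - single) * hours_per_year:.2f} h/year down")
print(f"two providers: {(1 - dual) * hours_per_year:.4f} h/year down")
```

The whole sketch hinges on the independence assumption, which is exactly what the EC2 incident called into question: replicating within a single provider does not guarantee it.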


The important thing is to treat an outage like this one, which happened at a time of low impact (during the holiday period and on one of the lowest-traffic days of the year), as something to learn from. For Amazon, to understand that failures - within reason - can happen, shit happens, but that other fundamental elements, such as communication, must not fail. For those with truly critical processes whose transactional impact translates directly into economic value, the lesson is to hedge to the extent that hedging can alleviate at least part of the possible damage, and to understand that the analysis is not a back-of-the-napkin calculation done once when the service was contracted, but a dynamic analysis based on the different options available, the evolution of their cost, that of our operating volume, etc. A risk, cost and benefit analysis that cannot be neglected.
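That dynamic analysis can start as something as simple as periodically re-running an expected-loss calculation. A minimal sketch, with entirely hypothetical numbers:

```python
# Illustrative risk sketch (all numbers hypothetical): compare the expected
# annual cost of outages with and without a mitigation measure, so the
# decision becomes a recurring cost/benefit calculation, not a one-off guess.

def expected_annual_outage_cost(outages_per_year, hours_per_outage, cost_per_hour):
    """Expected yearly loss from outages, in dollars."""
    return outages_per_year * hours_per_outage * cost_per_hour

baseline = expected_annual_outage_cost(2, 8, 5_000)     # no redundancy
mitigated = expected_annual_outage_cost(2, 0.5, 5_000)  # fast failover to a second provider
mitigation_cost = 20_000                                # hypothetical yearly cost of that failover

print(f"expected loss, no mitigation: ${baseline:,.0f}/year")
print(f"expected loss, mitigated:     ${mitigated:,.0f}/year")
print(f"mitigation pays off: {baseline - mitigated > mitigation_cost}")
```

The inputs (outage frequency, duration, cost per hour, mitigation cost) all drift over time, which is precisely why the calculation has to be dynamic rather than done once.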
AWS has allowed us to scale a complex system quickly, and extremely cost effectively. At any given point in time, we have 12 database servers, 45 app servers, six static servers and six analytics servers up and running. Our systems auto-scale when traffic or processing requirements spike, and auto-shrink when not needed in order to conserve dollars. In the ten months since we launched the public beta of our free, self-serve gamification platform we have handled over one billion API calls. Without AWS, that simply would not have been possible with our small team and limited budget. Many others have realized similar benefits from the cloud, and AWS has quickly become a critical part of the startup ecosystem.

Keith Smith, CEO of BigDoor, affected by the Amazon Web Services (AWS) outage

Indeed, Amazon Web Services (AWS) went down. No system is one hundred percent fault-free, and there are many lessons to be learned from all of this. But without Amazon, many things would simply be impossible. It is simply a matter of balancing cost against benefit.
For Amazon, the failure is going to be an important loss. Many things can fail, but what should not fail is the essence of what you promised your customers (completely independent systems) and your communication with them. Cloud computing is in its beginnings, and we will see failures like yesterday's on numerous occasions. But just as tangible as those failures are its advantages in terms of scalability, flexibility, cost, performance, efficiency and many others, to the point of becoming fundamental advantages that define, for many companies, the true to-be-or-not-to-be: a lowering of entry barriers that makes possible many things that would not otherwise be. Which does not mean that, like everything else, it may fail from time to time.

The cloud comes of age

The opening keynote of Apple's WWDC 2011 made one thing clear, among many others: Apple's official entry into a way of working that many have long been practicing through various combinations of tools.

After several unfocused attempts such as MobileMe, the company ushers in the much talked-about "post-PC era" with the announcement of iCloud, a service for which it has conscientiously prepared by building three huge data centers ready to host the documents, photographs and music of all its clients, who will find the service ready to be activated.

Apple's commitment to the cloud marks the coming of age of the cloud at the end-user level: everything takes place in it. Is a new operating system announced? Nothing to do with shipping DVDs, that's a thing of the past: it goes into the cloud at a deterrent price of $29.99, and users download it from there through the App Store, a decidedly aggressive move that will result in high update percentages. Do you take a photo on your iPhone? It is automatically sent to the cloud, and downloaded to your other devices. E-mail, calendar, backups, iWork files... everything is automatically mirrored in the cloud, turning the devices into windows through which to access it, each with its own limitations or specificities. Again, the Apple style: take a concept (call it "MP3 player", "mobile phone", "tablet", etc.) or, in this case, a way of working that for many is not new but was organized in a relatively unfriendly way (or "out of anyone's reach"), and turn it into a simple and natural way of working for everyone.

Apple is obviously not alone in this. Companies like Google or Amazon have long worked at different levels to bring cloud bits to their users, Salesforce and others offer ways for companies to work in the cloud, Microsoft has long been announcing that about 70% of its R&D efforts are dedicated to the cloud, and the telecommunications companies also intend to launch their offers (including moves such as the announced acquisition of Acens by Telefonica, which we will talk about in another entry): a race to take end users and companies to the cloud. A movement that, despite having been announced many times, needed a detonator, a competitor ready to set the bells ringing. Apple's idea is structured in freemium mode, with a combination of free and paid services that bring the company closer to the idea of a "service company": the hardware, the company's core idea and competitive advantage, almost "sells itself", while users download apps and use services in the cloud.

Let's get our act together: whatever we think of the cloud, whatever objections or justified fears we may have, it has just come of age, and it is here to stay. And Apple is not the cause, it is the symptom: the cherry on a cake that has been baking for a long time, and that many of us have been nibbling at for a long time. Soon, much more.

Tuesday, April 4, 2017

IBM report on the future of advertising: the end of advertising as we know it ...


IBM Global Services has published an interesting report, titled "The end of advertising as we know it", which, like the recently reviewed "Marketing Media & Ecosystem 2010" by Booz Allen & Hamilton (outlined here), becomes mandatory reading for those interested in the world of advertising.

Basically, the conclusions of the study are those we have been discussing for a long time, and those that led us to develop at the Instituto de Empresa the new Master in Digital Advertising and Communication, of which I am Academic Director and which begins in February: that the world of advertising is set to change as much in the next five years as it did in the previous fifty. Changes that affect attention, creative possibilities, the validity of metrics, the available advertising inventory and, above all, the control and attitude of the clients.

The report uses a mixed methodology of survey and focus group, and highlights the progressive importance of interactive advertising, from the application of advertising to micromarkets and segmentations that reach the individual, to multichannel and mobility, new creative formats and their interaction with the creations of the consumers themselves, etc. A truly interesting report that reaffirms many of the conclusions and analyses that many of us have been advancing for quite some time.

Cloud Computing Everywhere


Four articles in this week's Business Week ("How cloud computing is changing the world", "Cloud computing: small companies take flight", "Enter the cloud with caution" and "It's 2018: who owns the cloud?") mark the magazine's commitment to cloud computing, after a first introductory article published last April, "Cloud computing: eyes on the skies". And when a term arrives at Business Week, the bible of many executives, in the way this one has, it is an unmistakable sign: you'll get tired of hearing about cloud computing. To such an extent is the term seen as transcendent for the future that some have even tried to patent it...

No doubt, the issue is of considerable importance: as Hugh Macleod comments, cloud computing is the really important battle at the moment on the technology scene: the companies that dominate "the cloud" will be the true actors of the future, with concentration dynamics due to the very nature of the activity. To those of us who have been putting our data, our thoughts, our images, our social relations and our entire lives on the network, cloud computing is anything but surprising: practically a logical choice derived from working from many sites or many machines. But for companies, things go much further: coming from a past built around applications installed on desktops, brutally oversized and capable of doing many more things than any employee of the company could ever want to do, thinking about a future of minimalist applications residing on the network and accessible from anywhere is almost anathema. Until you start to try it: what is the point of putting a license worth several hundred dollars in the hands of each employee so that they can prepare documents in the most sophisticated way possible, when all they need is to write, and to share what they write easily with those they work with? Thus, companies that test applications of this type suddenly find themselves with surprising levels of productivity and satisfaction, and with ways of working that become much more logical the more they are used.

The idea of companies using application, processing and storage infrastructures in the hands of specialists is not new; it comes from the idea of utility computing in the sixties, when a computer was a very expensive resource that had to be shared. Although the reason has changed, the concept remains the same, and in the light of advances in virtualization, communications, security, and scalability architectures, it makes much more sense. Without a doubt, cloud computing is going to be the big discussion in corporate computing for the times to come.

Video Games: The Battle of the Cloud


I see on Slashdot, in "OnLive and Gaikai - How to Stop Gaming Revolution", a mention of the interesting battle that could begin to take shape in the video game industry with the development of platforms like OnLive or Gaikai that use the principle of cloud computing, in a scene completely dominated by three very strong competitors whose strategy revolves precisely around the sale of their own hardware platforms: Nintendo, Sony and Microsoft.

The stage has been billed as "the new two" versus "the big three", and has all the elements of an interesting battle: although OnLive has a small hardware development in the form of the so-called "microconsole", which comes with its own controller, its true approach, like that of Gaikai, is to allow the use of sophisticated games on any type of platform, such as a PC or Mac connected to the network via an ADSL line. OnLive recommends a minimum of 1.5 Mbps, rising to 5 Mbps to play on an HDTV, while Gaikai uses an adaptive platform developed on Flash that chooses the best resolution at any given moment according to the conditions of the transmission, in order to provide an experience without interruptions, and aims to position itself, according to its founder, as a service that companies in the industry could use to make games available to their customers in demo or complete mode even if they do not have the right platform, thus increasing the size of the potential market.
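The adaptive behavior attributed to Gaikai can be sketched as a simple tier-selection rule. The tiers below are invented for illustration (only the 1.5 and 5 Mbps figures come from OnLive's recommendations); this is not either company's actual logic:

```python
# Hypothetical sketch of adaptive quality selection: pick the highest
# stream quality whose bandwidth requirement fits the currently measured
# throughput. Tiers are illustrative, not OnLive's or Gaikai's real values.

# (label, required Mbps), ordered from most to least demanding
QUALITY_TIERS = [
    ("1080p HDTV", 5.0),   # OnLive's recommendation for HDTV play
    ("720p", 3.0),         # invented intermediate tier
    ("SD", 1.5),           # OnLive's recommended minimum
]

def pick_quality(measured_mbps):
    """Return the best tier the measured bandwidth can sustain, or None."""
    for label, required in QUALITY_TIERS:
        if measured_mbps >= required:
            return label
    return None  # below the minimum: the stream cannot be sustained

print(pick_quality(6.2))   # prints 1080p HDTV
print(pick_quality(2.0))   # prints SD (only the minimum tier fits)
```

A real platform would re-measure throughput continuously and switch tiers mid-stream, but the selection rule is the same idea.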

For the big three, the battle is extremely important because it largely defines the scenario of the future: for those who have already developed a widely accepted proprietary hardware platform, a scenario in which anyone can put their game within reach of the market simply by offering it through platforms on the net means far less control and far lower entry barriers. The logical defense is to increase the development and sophistication of proprietary platforms, using peripherals such as cameras and accessories of all kinds (pistols, golf clubs or sensors of every sort, such as the Wii Fit) that the new platforms would find much harder to introduce - the traditional distribution channel and consumers' fear of betting on something little known play an important role here - and whose experience would be less faithful, or directly impossible, if reproduced with a simple keyboard and mouse. The strategy would be completed with exclusivity agreements restricting the offer of titles through these channels, thus forcing developers to choose between platforms with a fully consolidated installed base of devices and emerging ones that propose the use of standard devices.

For the moment, little can be known: both platforms are still in closed beta, so it is too early to gauge their reception in segments such as heavy gamers, who often have a great influence on the industry's development and tend to prefer platforms designed specifically to maximize the gaming experience over commonly used hardware.

Monday, April 3, 2017

Amazon opens the cloud


At the time, the launch of Amazon Web Services slowly led many companies to experiment with the service: a system designed to offer low barriers to entry, both technological and economic, meant that many technology managers practically started out playing with it, and then moved on to full applications and services, in the majority of cases following a dynamic of growing use, despite the fact that the system's limited-margin policy results in truly bare-bones support. The service was launched in July 2002, probably at a time when the idea and philosophy of cloud computing was still far from the minds of technology professionals, and even further from those of management, but by September 2009 it was already showing usage and growth figures worthy of mention. In practice, Amazon Web Services has become an evangelizing service, capable of making many companies see it as a more than reasonable alternative: it can lower your costs, offer you flexibility, reduce your stress level, or in some lucky cases all of those things at once, but it is certainly a remarkably successful service.

With the same philosophy, Amazon now launches its offer for the end-user market, at a time of much greater mental preparation for the subject and in a market usually characterized by much more viral growth dynamics. It consists of two services: on the one hand, storage; on the other, music. Both are easy to understand, and have enormous potential: the storage service is called Amazon Cloud Drive, and offers five free gigabytes to keep whatever you want, with a price table as easy to understand as "one dollar per gigabyte per year" from there on. Life insurance for anyone who wants to keep their important files in a bomb-proof place (much more secure than your personal computer or even your corporate server) that can be accessed from any machine, at any time. Without a doubt, this service will make many users rethink their habits: keeping family photos, important files and so many other things on the hard drive of a computer that can break down tomorrow, onto which a cup of coffee can be spilled, or that can be stolen at any moment while you carry it down the street, is just waiting to have a problem: the question is not whether, but when.

The other service, Amazon Cloud Player, is destined to act as a draw: buy or upload your music - any music, Amazon will not ask you about its origin, only about its format and that it is DRM-free - and download it or play it from anywhere, anytime, on a wide range of devices. At the moment, the MP3 store is accessible only if you are an American customer (or have a proxy :-) but the usage model seems clear and simple enough to attract an interesting number of customers, with the same price scheme as the Cloud Drive: five free gigabytes, then one dollar per gigabyte per year (at predetermined tiers of 20, 50, 100, 200, 500 and 1000). No cumbersome licenses or questions of any kind: your music, for your personal use.
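The pricing scheme as described is simple enough to compute. A small sketch, under my own assumption (not confirmed by Amazon) that you pay for the smallest listed tier covering your usage, at one dollar per gigabyte per year:

```python
# Sketch of the pricing described above, as I read it: 5 GB free, then one
# dollar per gigabyte per year, billed at the listed tier sizes.
# The tier-covers-usage rule is my assumption, not Amazon's documentation.

FREE_GB = 5
TIERS_GB = [20, 50, 100, 200, 500, 1000]

def yearly_cost(gb_needed):
    """Dollars per year for the smallest tier covering gb_needed."""
    if gb_needed <= FREE_GB:
        return 0
    for tier in TIERS_GB:
        if gb_needed <= tier:
            return tier  # $1 per GB per year, charged at the tier size
    raise ValueError("beyond the largest listed tier")

print(yearly_cost(4))    # prints 0 (fits in the free allowance)
print(yearly_cost(35))   # prints 50 (smallest covering tier is 50 GB)
```

At these prices, a typical music collection of a few tens of gigabytes costs a few tens of dollars per year, which helps explain the viral potential of the offer.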

This Amazon proposal may mean the definitive popularization of the cloud concept for end users who, in many cases, already used it for a wide variety of services but possibly had not thought of it as such. And from popularization at the user level come usage patterns that in many cases extend to the company, especially in certain segments such as SMEs, which still ask such basic questions as "is this safe?" or "aren't my data more secure on my company's computer?", whose answer, as we have been saying for years, though apparently counter-intuitive to anyone accustomed to thinking in terms of physical rather than virtual security, is simply NO.