Technology News in English

Gartner Surveys Confirm Customer Experience Is the New Battlefield

25 Oct 2014 03:27:24 Z

Every year, my colleague Laura McLellan looks into marketers' pockets. How they spend their money, she's likely to tell you, reveals what's on their minds. Here, truth isn't subject to the distortions of interpretation, politics or spin that, as industry analysts, we occasionally negotiate.

Here, what you see is what you get, and what you get is a clear picture of what really matters.

Laura's soon-to-be-released findings from the 2015 Marketing Spending Survey follow the marketer's money trail. These findings tell an interesting story. Here's what caught my eye:

  • Marketing budgets themselves account for, on average, a healthy 10.2% of revenue, and forecasts for spending next year are stronger still: up over 10% for those planning increases.
  • Marketers are increasingly taking on bona fide P&L responsibility, particularly as business models and revenue streams go digital. Fully half of marketers report that they either have P&L responsibility today or expect to have it in the next 24 months.
  • Customer experience is the most pressing mandate for marketers, the top area of marketing technology investment in 2014, and it will lead innovation spending for 2015.

This last part, in particular, should cause you to sit up straight. Why? Because, as competition and buyer empowerment compound, customer experience itself is proving to be the only truly durable competitive advantage. A recent Gartner survey (available to Gartner for Marketing Leaders clients here) on the role of marketing in customer experience found that 89% of companies expect to compete mostly on the basis of customer experience by 2016, versus 36% four years ago.

According to the same Gartner research, fewer than half of companies see their customer experience capabilities as superior to their peers, but two-thirds expect these capabilities to be industry leading or much more successful than their peers within five years.

Wishful thinking? Perhaps. But what it does reveal is a punctuated shift in emphasis and spending as marketers recognize that customer experience is fast becoming the new battlefield.

In fact, one of the top 10 Gartner predictions this year puts a fine point on that assertion:

By 2017, 50% of consumer product investments will be redirected to customer experience innovations.

Maybe Mercedes Benz USA President and CEO Steve Cannon put it best. "Customer experience," he says in Loyalty360, "is the new marketing." I couldn't have said it better myself.

Compuware Perform 2014 Conference

25 Oct 2014 03:27:24 Z

I was invited to attend Compuware Perform in Orlando the week of November 6th, and spent a couple of days there before heading off to Europe for 10 days of vacation, hence this slightly delayed blog post (not to mention the Magic Quadrant and other research, which is imminent now). The conference was a great one to attend, as Dynatrace re-introduced the world to the brand: what's old is once again new. The products have been renamed under the new Dynatrace branding.

John Van Siclen, previously the CEO of dynaTrace (acquired by Compuware in 2011), was the general manager for the Compuware APM business. He is now the General Manager of the Dynatrace company. The expectation is that this will become a standalone company when the privatization of Compuware closes in the fourth quarter of this year (more on that later). John's keynote had some interesting points: Compuware has an impressive 89.9 Net Promoter Score and an online community of 84,000 people.

Dynatrace launched four key targets and messages for its solutions: Launch Readiness, User Analytics, Performance Engineering, and Production Monitoring.

Dynatrace plugins will be open (as other APM companies have done), and they will be leveraging GitHub.

Innovation will flow from Ruxit to Dynatrace, including the network probe agents and the UI. I've already posted my thoughts on my time with Ruxit, which were positive.

Dynatrace 6 was unveiled, which includes agent support for technologies such as NGINX and IBM IMS. There were improvements for TIBCO ActiveMatrix, TIBCO EMS, Java 8, iOS 8, NGINX, HBase, Cassandra, MongoDB and iPlanet. Dynatrace showed the new Web UI, which provides some high-level views outside of the thick Java-based client it uses today. I wasn't too impressed with the new dashboard and UI, but it was an early-stage prototype I wouldn't have expected to see in a public forum. I am sure it will improve considerably before it ships next year.

There were some nice improvements in synthetic monitoring, including a free web test for lead generation (I still believe what you get for free with WebPageTest is far more complete). Investment in synthetic monitoring seems too heavy; this has been an issue with Compuware for quite some time.

I spent time in a couple of sessions on DCRUM, and the product is still very focused on the network buyer and legacy applications. I wasn't too impressed with what I saw. I would have liked to see a more innovative approach to solving these problems, similar to the agent present in Ruxit, which is how we see the future of network analysis being done.

The one large piece missing from my discussions and the messaging was analytics. While Dynatrace has plenty of great analytics technology within its products for determining root cause, understanding performance deviations, and finding the major issues within applications, its answer to the broader analytics and ITOA strategies that many are strategically invested in consists of integrations with providers like Geckoboard, Splunk, and others. This is clearly not what APM buyers are looking for today.

On the non-product side, many of you are aware that private equity (PE) firm Thoma Bravo, which has an extensive background in the monitoring and management space (Network Instruments, Keynote, Infovista, and others), has decided to purchase Compuware and divide the company into several parts to drive growth. While many PE firms operate in other ways, Thoma Bravo is a unique firm. One of its VPs, Chip Virnig, spoke a little about how APM is a great market in which to place bets (which I agree with).

Some recommended reading about PE firms (sorry, clients only): How to Re-evaluate Strategic Vendors Acquired by Private Equity

I spoke to several customers, some of whom were using end-to-end mobile APM. There were interesting use cases, and capabilities within the offering have matured nicely. Expect new research and presentations on mobile APM in the next six weeks at our upcoming Gartner Data Center Conference. Dynatrace 6 offers major improvements in scalability in terms of how many controllers are needed; customers confirmed they needed far fewer controllers and management servers with the newer products, which is good to hear from end users. There is more planned in terms of scale and capabilities.

Even though I was only able to attend part of the conference, I got a lot of value out of it.

Disclosure: Compuware paid for travel and hotel for my attendance at this conference.

Is Social Marketing Actually a Grand Illusion?

25 Oct 2014 03:27:24 Z

It’s time to get real about measuring social marketing. I’m not talking about ads on Facebook or, ahem, Snapchat — that’s just advertising. I’m talking about that thing we’re supposed to be doing around the clock: creating great content that reflects our values and sharing it on our social channels so whoever’s there will pass it along and we will . . . well, what happens next is never entirely clear, but it’s got to be something. Right?

Let’s come right out and ask it: Does social marketing have any real impact on any business metric?

That’s business metric. Not social metric. Using fan counts or retweets as a measure of impact is kind of like asking the Cake Boss how your diet is going. No, I’m talking about connecting all the stuff we’re doing to influence social chatter to the things that ultimately matter: revenue, market share, profitability, stock price, market cap.

Good news: rather than offering mere opinion, here’s actual data. For the six months ending May 2014, Gartner used a social listening tool called Tracx to collect and analyze public conversations about a select list of companies on popular social channels such as Facebook, Twitter and blogs. The targets we chose were leading digital marketing and ad tech software companies and digital marketing agencies. There were 28 of them. This is what we found:

There was no observable correlation between social marketing volume and business success for digital marketing software and services companies.

You heard me. Before we start parsing out explanations — or you tell me the methodology is flawed — let me take you through what we saw. Also note that this was not a peer-reviewed experiment and is not officially published Gartner research. It is a conversation starter.

Sample: The 28 companies in our sample included both large brands, such as IBM and Adobe, and smaller ones such as Datalogix, DataXu and Turn. These are companies that come up the most in inquiries with our clients. We also included a few prominent agencies such as Razorfish and Xaxis.

Filters: In setting up our monitors, we tried to filter out irrelevant conversation (to the extent possible) using negative matches, and to include only conversations relevant to digital marketing. For example, we were only interested in following Google Analytics, not Google’s self-driving cars, and IBM’s digital marketing platform, not its hardware or consulting businesses. Some brands such as Turn were very difficult to handle, but we did our best.

Channels: We looked at public conversations on the brand’s own pages, as well as conversations happening elsewhere. We included news sites, Twitter, blogs, Facebook, Google+, YouTube, Tumblr, Instagram, LinkedIn and others.


(1) Some brands make a lot more noise than others

Our sample could be divided into three types of brands: noisy, average and quiet. Like so (showing total posts captured):


Obviously, Google is a hot topic, but it’s always an outlier. So is comScore, because it comes up in conversations that cite it as a source for media metrics. On the other hand, HubSpot is an acknowledged master of social marketing — that’s what it does, after all: enabling brands to market themselves — so it’s clearly doing something right. Among the other players, Marketo (a public company), Sitecore (CMS) and Optimizely (an optimization tool) are making some noise. Meanwhile, our friends at Turn, Rocket Fuel (pre-brouhaha) and SAS are more subdued.

But does any of this matter? Well, it’s hard to see how. Everyone loves HubSpot, and it recently had a successful IPO, but is it really a better company than Turn? Better at social marketing, certainly. Its market cap is about $1 billion. On the other hand, Marketo is making only about 20% as much social noise but is valued at $1.6B. Could HubSpot hold back a little and see its value rise?

And what about our wallflowers: is their comparative modesty hurting their bottom lines? [x+1], a well-regarded DMP/DSP, was recently acquired by Rocket Fuel. Certainly, its social marketing was softer than its product. Would a better program have gotten it a higher exit price? Maybe. But what about Turn or SAS? Turn raised $80M in January, and SAS just built a new building in North Carolina. They’re making barely a peep in the social space. Are their investors and customers just ignorant?

No, I’ll be honest and say I can’t find any obvious correlation between social marketing volume and product quality, company value, profitability, or anything really except — well — social marketing volume.
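
The volume-vs-value comparison above can be checked numerically. Here is a minimal sketch of that kind of correlation test; the post counts and valuations below are made-up illustrative figures, not the actual Gartner/Tracx dataset.

```python
# Hypothetical sketch of the correlation check described above.
# Post counts and market caps are invented for illustration only.
from statistics import mean, stdev

social_posts = [52000, 9800, 31000, 4100, 15000, 700, 22000, 2600]  # total posts captured
market_cap_m = [1000, 1600, 450, 900, 300, 1200, 650, 800]          # valuation in $M

def pearson(x, y):
    """Plain Pearson correlation coefficient between two samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

r = pearson(social_posts, market_cap_m)
print(f"Pearson r = {r:.2f}")  # values near 0 mean no linear relationship
```

A real version of this check would also want a significance test, since with only 28 companies a weak-looking correlation can easily be noise.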

(2) Social Measurement Is Really Twitter Measurement

Part of the problem here is that social measurement itself is flawed. We don’t and can’t get a true picture of the entire “social conversation” using the available tools. Much of the social net is dark matter: IMs and emails (if you count them) are invisible; as are non-public conversations and much non-text communications like pictures. The reliance on public sources has the effect of making Twitter seem a lot more important than it is.

Here’s the channel skew of our sample:


Well over half the “conversations” happening about our companies were tweets. You’ll find this in most social reports — they are overwhelmingly reports about what’s happening on Twitter. Why? Twitter is almost 100% public and available to social monitoring tools. It encourages short bursts of outrage. Facebook is mostly non-public; remember, social monitoring tools are not on your “friends” list. They are strangers to every party. Brand pages are often public, but I suspect most real, unfiltered product-related commentary does not happen on brand pages. (Blogs are probably overrepresented here because our companies — digital marketing software and services players — are the subject of a disproportionate load of blog ink.)
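
To make the skew concrete, here is a tiny sketch of the per-channel tally; the counts are invented for illustration, and only the "well over half are tweets" shape mirrors our sample.

```python
# Hypothetical per-channel post counts (illustrative only, not our real data).
counts = {"Twitter": 61000, "Blogs": 18000, "News": 12000,
          "Facebook": 6000, "Other": 8000}

total = sum(counts.values())
shares = {channel: n / total for channel, n in counts.items()}

# Print channels from largest to smallest share of the captured conversation.
for channel, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{channel:10s} {share:6.1%}")
```

Any tool-based "share of conversation" chart built this way inherits the bias described above: the denominator only contains what the tool can see.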

So I’m admitting our measurement methodology is flawed — badly flawed. But it’s not our fault. A more complete picture might show us stronger patterns, where smart social marketers dominate a productive dialogue with engaged consumers. But we don’t get to hear that conversation. Nobody hears all of it. Not even the NSA.

Although I’ve blogged previously that long posts are more likely to be successful than short posts (check me here, bros), I’m pushing it. Consider this Part 1 of 2 (or even 3). In the next part, I’ll continue revealing the results of our little experiment, including these amazing topics:

  • What is “buzz” worth? Comparing HubSpot and Marketo
  • Does self-promotion actually work?
  • Is it more important for companies to have a good product or a good social marketing program?

And so much more . . . see you next week.

We Are All Technologists Now

25 Oct 2014 03:27:24 Z

Many of us do it, but we shouldn’t. The ‘it’ here is making the statement “I’m not a technologist…” It’s always followed by a qualifier, such as “…so I cannot really answer your question. I have to check with one of our architects/developers/managers.” The latter part of the response is fine; it’s often appropriate and necessary.

I have a problem with the first part of the response. If you work in the technology industry, you are a technologist. Every person who pitches, markets, sells, supports, implements, and services the technology has an obligation to understand that technology as deeply as they possibly can.

When our customers and prospects take a meeting with us, they are spending their valuable time in order to accelerate their knowledge-gathering curve. When we don’t know enough about our products and how they work at the conceptual, design, or architectural level, there is a lost opportunity cost. Clients and prospects are at their most receptive to knowledge at the moment we are talking to them.

The more you can share about your technology, the more you answer your clients’ or prospects’ number one concern: why should I do business with you?

P.S. Forgive me for making a pun on President Nixon’s view of economics in my blog’s title.

Why does the MDM (aka master data) hub have to capture the state of data?

25 Oct 2014 03:27:24 Z

I was enjoying an invigorating inquiry with an end-user client yesterday.  The question on the table was this: why does the MDM hub, the place where the single source of truth (for master data) resides, need to be the place where we recognize, capture and govern the state of data?  I felt this was an intriguing question.  It immediately exposes, for me, one of the weaknesses of our collective information infrastructure and application landscapes, and also highlights an apparent fatal flaw in our current information systems design.

First, the weakness: even with all the talk of standards, protocols, and metadata, the information systems and applications deployed around our organizations do not, on the whole, get on well with each other.  I don't mean to say they are not or cannot be integrated.  Many of our systems are integrated; that is very different from exploring the degree to which each system understands or respects the (information) needs of others (i.e., context).  In fact, by design, we have compartmentalized context, because that is how we do things.  Using systems thinking, we break things down into their smallest common components and assemble them into logical or meaningful groups.  Thus the concept of an application, or app, is somewhat explained.

This exposes the flaw.  A business application is meant to be a representation of how work should get done.  Years ago these were monolithic beasts; now they are almost transient services assembled periodically, perhaps one day on the fly.  However, the very notion that we can predefine how work gets done leads to the idea that there is a boundary, and a boundary results in a semantic model that meets the needs of that which is bound.  Thus we have many applications, each with its own authority model over its own semantic landscape.  We designed systems efficiently, so that they did not have to concern themselves with other applications' needs for the same information shared or "integrated" between them.

We are not going to solve this problem quickly, if at all.  The fact that I said this was a "problem" is itself problematic; it may not be a real problem that needs to be "solved," whatever "solved" means.

Maybe we don't need to solve the entire problem.  In fact, Master Data Management (MDM) is a great case in point, one that should be brainstormed at the highest echelons of IT strategy.  I say this because MDM is at once a Trojan horse and, at the same time, a silver bullet.  It is a Trojan horse because it is showing us that we DON'T actually need to govern, master, and properly manage all the data in our business systems.  We don't need to because people are generally quite good at doing things, and at coping with adversity.  An effective MDM program helps an organization prioritize and effectively manage only what matters, and govern what gives back.  Much data is fine as is, in whatever state it finds itself.  If we only govern the part of the information make-up that really, truly matters, good things will come, and the rest of the world will operate quite nicely, thank you very much.  Another way to look at it might be this: we only need to exploit our master data slightly better than our nearest competitor…

MDM is also a silver bullet:

  • It sounds so easy
  • It’s not even that new as an idea (didn’t an Enterprise Data Warehouse do this anyway?)
  • It's a technology: can't we just buy one?
  • We can't afford a massive cost, so is there an easy way to reconcile and manage this data on the cheap?

MDM is not a technology.  MDM is not easy.  MDM is not cheap, but I don't mean to say that it must cost you $1M to acquire.  I mean to say that the mental energy (executive and non-executive) needed to understand what MDM is, and how to make it work for the organization, is not insignificant.  The money you spend on "it" is the least of your worries.  Truly understanding what makes MDM different from ERP, EDW, data integration, SOA, cloud, and in-memory is the real concern.

So, back to the question: yes, the MDM hub needs to play the role of the source of information state, since the rest of your information infrastructure and application landscape is incapable of doing so for the benefit and information of others.  This is a most interesting idea, one that sounds pretty dry, even IT-ish, but from a business perspective (note that I have never worked in IT!) it is really quite interesting.  One I wish more of us explored, ideally with a nice, cool libation in hand.


25 Oct 2014 03:27:24 Z

Is 15 minutes a mere instant or an eternity? Is getting an alert 15 minutes after it was first generated fast enough? And the opposite question: is 15 minutes of MSSP-side alert triage enough to make sure that the alert is relevant, high-priority and high-fidelity? Indeed, spending too little time leads to poor quality alerts, but spending too much time on quality alerts leads to the attacker achieving their goals before the alert arrives and is acted upon.

So, yes, I did speak with one MSSP client who said that "15 minutes is too late for us" and another who said that "an MSSP cannot do a good job qualifying an alert in a mere 15 minutes" (both quotes fictional, but both "inspired by a real story").

The answer to this (frankly, not overly puzzling) question is, again, security operations maturity. On one end of the spectrum we have folks who just "don't do detection" and rely on luck, law enforcement and unrelated third parties for detection (see this for reference). On the other, we have those with ever-vigilant analysts, solid threat intel and hunting activities for discovering the attackers' traces before the alerts even come in.

As we learned before, the security chasm is very strong in this area.

Therefore, a meaningful MSSP SLA discussion cannot happen without the context of your state of security operations.

For example, if you…

  1. … have no operation to speak of and plan to hire an intern to delete alerts? You can accept any alert SLA [SAVE MONEY!!! GET YOUR ALERTS BY SNAIL MAIL! CARRIER PIGEON OK TOO! :-)], whether it is at the end of the day or even the end of the week. If you have no plan to ever act on a signal, a discussion of the timing of action is senseless.
  2. … can act on alerts when really needed, and will probably scramble a response if something significant happens? Look for a few hours or similar timing, and limit alerts to truly critical, "incident-ready" ones.
  3. … have a defined security monitoring/response function that is equipped to handle alerts fast? Aim at up to an hour for significant alerts and others maybe at the end of the day.
  4. … possess a cutting-edge security response operation? Push your MSSP partner to 15 minutes or less, for the best chance to stop the attacker in their tracks. Set up a process to review and process alerts as they come in, and refine the rules on the fly. Respond, rinse, repeat, WIN!
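
The four cases above amount to a lookup from operational maturity to a sensible alert SLA. Here is a rough sketch of that mapping; the tier names and timings are my paraphrase of the list, not an official formula.

```python
from datetime import timedelta
from typing import Optional

# Maturity tier -> roughly appropriate MSSP alert-delivery SLA
# (paraphrasing the four cases above; names are illustrative).
SLA_BY_MATURITY = {
    "no_operation":       None,                   # any SLA works; you won't act anyway
    "ad_hoc_response":    timedelta(hours=4),     # a few hours, critical alerts only
    "defined_monitoring": timedelta(hours=1),     # up to an hour for significant alerts
    "cutting_edge":       timedelta(minutes=15),  # push the MSSP to 15 minutes or less
}

def recommended_sla(maturity: str) -> Optional[timedelta]:
    """Return the alert-delivery SLA worth paying for at a given maturity tier."""
    return SLA_BY_MATURITY[maturity]

print(recommended_sla("cutting_edge"))  # 0:15:00
```

The point of the `None` entry is the key message below: an SLA only has value if someone is actually going to act within it.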

The key message is: you don't want to pay for speed that you won't be able [or don't plan] to benefit from. If security alerts will sit in inboxes for hours, you don't need them delivered in minutes.

Now, what about the SLAs for various management services, such as changing NIDS rules and managing firewalls? SLAs play a role here as well, and, you guessed it, what you need here also depends on the maturity of your change management processes. Some people complain that an MSSP is too slow with updates to their security devices, while others know that the MSSP does it faster than they could ever do it themselves.


Big Data Is Entering the Age of Aquarius

25 Oct 2014 03:27:24 Z

Suddenly, I realized: fluids are in, animals are out. The big data ecosystem has given up on its elephants, impalas and pigs in favor of aquatics. Perhaps the shift started with “data lakes,” or perhaps data lakes just reflected the state of big data (pun intended). Or maybe Cascading was the one that signified the shift: Cascading was the first to enable data application development on Apache Hadoop, giving developers Driven, Lingual and Scalding. It is now, obviously, Fluid.

According to Metanautix, navigating data has never been so fluid. Did you notice not just nautix, but also meta in the name? This is for a good reason: metadata is key to deriving value from big data. Tonight (23 October), I am moderating a panel on enabling Hadoop data, so that a data lake wouldn’t turn into a data marsh. Incidentally, one of the panelists is Waterline Data Science. If you are in the Bay Area, come see the panel in person in Sunnyvale; everyone is welcome.

Big data is flowing through all kinds of data pipelines and is streaming from all things, up to the point of becoming a DataTorrent. It can run freely in its purest state, H2O, or get sublimated into a Snowflake in the cloud. If weather permits, data could even pour like FirstRain (a company that trademarked the term Personal Business Analytics!).

You might wonder whether the current big data darling Spark fits into the Age of Aquarius picture. It does: how about the killer app Sparkling Water? The new wave of big data technologies is rising!



Follow Svetlana on Twitter @Sve_Sic

Debt Crowdfunding Holds Much Promise

25 Oct 2014 03:27:24 Z

In contrast to my recent post arguing that equity crowdfunding doesn’t exist, I believe debt crowdfunding is almost a foregone conclusion. This is ironic, since only about 4% of the 50 sites I looked at were debt crowdfunding sites (as opposed to 18% for equity crowdfunding). Prosper and Lending Club are two of the more prominent debt crowdfunding sites (at least in the US). Also ironic is that equity crowdfunding gets much more positive press than debt. They say “Invest in the next Facebook, Google or Twitter,” as if you are simply a click away from making millions. This is highly unrealistic. However, getting a 6% to 8% return by “peer lending” money is very realistic, albeit quite a bit less exciting. Here are my arguments as to why debt crowdfunding has promise. I’ll use the same structure I used in my analysis of equity crowdfunding.

They are more inclusive.

The debt sites I examined are also currently limited to accredited investors. This is to be expected, since it is the law (in the US, at least). However, they are positioned well to adapt to Title III of the JOBS Act (crowdfunding) when the SEC does deliver approved rules. It seems they always intended to adapt. The whole process, from signing up to investing to tracking, could all remain the same. The only adjustment would be that there is no need to qualify oneself as an accredited investor when signing up. I can foresee a boom in participation once the crowds are allowed to invest and made aware. In fact, I will go on the record as saying it is the supply side that will limit growth. The crowdfunding sites will struggle to keep a satisfying flow of loans available to the crowds. If anything cripples a debt site and causes an exodus of the crowd, it will be this.

They are inexpensive to investors.

Investment minimums for debt crowdfunding are as low as $25.00, and service fees are usually less than 1% of the return. These are numbers that the crowd can handle. I believe the costs to get a loan are still a bit high, but as debt crowdfunding grows and competition for loans increases, we will most likely see origination fees decline. Now, the $25.00 minimum and the 2,000-investor limit currently restrict the amount of a loan to a maximum of $50,000. But when Title III hits, the limit will be constrained by the $1M annual cap rather than by the number of investors. For peer lending to your average citizen, this limit isn’t much of a limit. But it also provides plenty of room for loans to small businesses. So I see the opportunity for debt crowdfunding to expand to new types of loans.

Last week I examined 1,300 peer-lending loans; here are some findings. There were two major reasons for the loans: 58% were for loan refinancing or debt consolidation, and 28% were for credit card payoff. The next most frequent reason was home improvement, at a distant 4.36%. Small business loans were 1%, and home purchase was 0.5%. So right now debt crowdfunding is primarily an alternative to high-interest credit cards. Over time I would expect to see this change as debt crowdfunding becomes more mainstream. I would expect, in particular, to see business loans, home improvement, and major purchases grow as a larger percentage. However, I believe credit payoff/consolidation will dominate for the next 3-5 years. Other interesting numbers: 40% of the loans were from people with a FICO credit score over 700; 65% of the loans had a 36-month term and 35% had a 60-month term; 3% of the loans had an interest rate over 20%, 61% had a rate between 10% and 20%, and 35% had a rate below 10%. The lowest interest rate was 6%. This may not be the same as holding shares of the next Facebook, but it is pretty compelling when compared to savings and CD interest rates.

They are transparent (enough).

The sites provide basic information on the loan, such as the reason, the term, the rate, the amount, the borrower’s credit score, and a site-assigned risk score. There are often additional details, such as the location (state) of the loan, length of employment and verified income. This should be enough information for due diligence on a $25.00 investment. Most of the risk management comes from the risk balance of a portfolio of loan notes: a non-accredited crowd investor who has $2,000.00 to invest can assemble a portfolio of 80 notes that together comprise an acceptable level of risk. Lending Club, for one, provides information on a selected portfolio, including the anticipated default rate, before you execute on buying the set of notes.
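
As a back-of-the-envelope illustration of why the portfolio view matters, here is a sketch of an expected-yield calculation. The average rate, default rate and loss-given-default figures are assumptions for illustration, not Prosper or Lending Club statistics.

```python
# Assemble a hypothetical $2,000 portfolio of $25 notes, as in the example above.
note_size = 25.00
budget = 2000.00
n_notes = int(budget // note_size)  # 80 notes

# Illustrative assumptions (not actual platform data).
avg_rate = 0.12            # average note interest rate
default_rate = 0.04        # expected annual default rate across the portfolio
loss_given_default = 0.90  # fraction of principal lost when a note defaults

# Expected yield: interest on surviving notes minus principal lost to defaults.
expected_yield = avg_rate * (1 - default_rate) - default_rate * loss_given_default
print(f"{n_notes} notes, expected yield ~{expected_yield:.1%}")
```

With these assumptions the expected yield lands near the 6% to 8% range cited above, and spreading the $2,000 across 80 notes is what keeps any single default from dragging the whole return down.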

They are direct.

With a few clicks you directly invest in the notes. You don’t have to request more materials or apply for the opportunity to invest. You don’t invest in a security that assembles a set of notes into multiple tranches (sound familiar?); you invest in an individual loan note. Within minutes you can assemble a portfolio of hundreds of notes. Now, it can take a week or two for those notes to close, and a subset of them won’t close, so it can take several tries and several weeks to invest all your money. Then, about 30 days after your first note closes, you will start to see returns accrue. And this, of course, depends on the number of loans available for investment. If you want to invest $50,000, $25 at a time, in notes that have a 36-month term, a credit score over 750 and are originated in Texas, then it can take you a long time to get that money invested. But if you are less restrictive and put $25 across a variety of notes, then it will go much more quickly. With debt crowdfunding, where risk is spread across many loans, investment risk management shifts from the individual investment to the portfolio.

All of these factors combine to make debt crowdfunding more appropriate for the masses than equity crowdfunding. Although equity might get more crowd-friendly, I believe the near-term and mid-term promise for crowdfunding securities lies with debt.

As always, I’m interested in questions and opposing or supporting positions. If anyone knows of great debt crowdfunding sites, please let me know and I’ll look at them.

Soon I’ll move on to reward-based crowdfunding.

Magic Quadrant for Application Delivery Controllers (ADC)

25 Oct 2014 03:27:24 Z

We just published the 2014 Magic Quadrant for Application Delivery Controllers (ADC) (Gartner subscribers only). The Magic Quadrant includes analysis of ten vendors in the ADC market. In going through the research process, here are a couple of things of note…

Changing Landscape

In comparison to the 2013 ADC Magic Quadrant, four of eleven (36%) vendors changed positions. The resulting ten vendors in the 2014 ADC Magic Quadrant include:

  • A10 Networks
  • Array Networks
  • Barracuda Networks
  • Citrix Systems
  • F5 Networks
  • KEMP Technologies
  • PIOlink
  • Radware
  • Riverbed
  • Sangfor

Three of these vendors (A10, Barracuda, PIOlink) underwent initial public offerings, raising over $300M USD. This is a substantial injection of capital into the $1.6B ADC market, which should provide additional financial flexibility and security, and ultimately could/should drive further R&D investment, leading to innovation in the space.

Buying Profiles

Generally, we see three types of ADC buyers: basic, extended and advanced. Basic buyers are looking for a load balancer, no more, no less. These buyers don't want, don't need or don't know about the more advanced ADC capabilities. However, most of the folks we speak to are extended buyers, who are looking to leverage several of the more advanced features such as WAF, GLB, programmatic scripting, FEO, etc. The most advanced buyers are looking for the advanced features AND delivery in a resource pool that is dynamically integrated with orchestration systems or CMPs like VMware/OpenStack, or as part of an SDN service chain.

To the Cloud

ADCs sit in front of application servers, so it is no surprise that as workloads move to the cloud, ADC vendors are integrating their products into cloud ecosystems. Over the past year, vendors have made significant progress in integrating their products tightly with CMPs (e.g., VMware, Microsoft and OpenStack, with LBaaS plugins for all) as well as within public cloud providers like AWS, Azure, Rackspace, SoftLayer, vCloud Air and others.

And security matters too…

Over the last 12 months, we've seen a number of security issues, including Heartbleed, weekly breaches in retail, concerns over governmental spying and proliferating DDoS attacks. These all underscore the need for defense in depth, which includes the application security that an ADC can provide.

This is just a snippet of the research; you can access the full Magic Quadrant here:

Magic Quadrant for Application Delivery Controllers (Analyst(s): Mark Fabbi | Andrew Lerner)

Summary: The application delivery controller is a key component within enterprise data center and public cloud architectures. Network, security and application personnel should evaluate ADCs based on how they integrate with key applications and cloud/virtualization platforms.

Regards, Andrew


Gartner Forecasts Triple-Digit Growth in 3D Printer Shipments

25 Oct 2014 03:27:24 Z

3D printer sales to exceed $13.4 billion in 2018 with 2.3 million units shipped.

Our annual forecast incorporates all 3D printers across all of the current 3D printing technologies. Globally, we expect shipments will grow at a compound annual growth rate of 106.6% and revenue will climb at a CAGR of 87.7% through 2018.
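The compound growth figures above can be sanity-checked with a few lines of arithmetic. The sketch below (a hypothetical helper, not Gartner's forecast model) backs out the 2014 base values implied by the 2018 endpoints and the stated CAGRs, assuming 2014 to 2018 spans four compounding periods:

```python
def cagr_project(base, rate, years):
    """Project a value forward at a compound annual growth rate."""
    return base * (1 + rate) ** years

def implied_base(final, rate, years):
    """Back out the starting value implied by a final value and a CAGR."""
    return final / (1 + rate) ** years

# 2.3M units in 2018 at a 106.6% CAGR; $13.4B revenue at an 87.7% CAGR.
units_2014 = implied_base(2.3e6, 1.066, 4)    # roughly 126,000 units
revenue_2014 = implied_base(13.4e9, 0.877, 4) # roughly $1.08 billion
print(round(units_2014), round(revenue_2014 / 1e9, 2))
```

The implied 2014 base, on the order of a hundred thousand units, shows how dramatic a climb to 2.3 million shipments in four years really is.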

We also project the number of printer shipments and revenue across ten regions worldwide. Greater China’s 120.6% CAGR shipment growth will outstrip all other regions through 2018.

The 3D printer market is clearly at an inflection point. Unit shipment growth rates for 3D printers, which languished in the low single and double digits per year since the early days of additive manufacturing, are poised to increase dramatically beginning in 2015. As radical as the forecast numbers may seem, bear in mind that even the 2.3 million shipments that we forecast for 2018 are a small fraction of the total addressable market of consumers, businesses and government organizations (including the military) worldwide.

At the top level, the 3D printer market today has two main branches. While there is some overlap, as there is in any industry that offers products for both markets, consumer buyers are very different from enterprise buyers. Understanding that fact is one of the critical keys to understanding the 3D printer market.

The primary market drivers for “consumer” 3D printers (typically under $1,000) are lower prices, improved performance and expanded global availability. The primary “enterprise” 3D printer market drivers are the viability of 3D printing technologies for prototyping and manufacturing coupled with lower 3D printer costs, improved quality and a wider range of materials.

Gartner has been covering the 3D printer market as it emerged into a viable technology for a wide range of users over the last seven years. After more than two decades of research and development, as well as numerous applications in specialist industries, we find that the technology has finally achieved manufacturing readiness in several industries and can augment manufacturing processes in many more.

A second critical key to understanding the market is the fact that, today, there are seven 3D printing technologies, each with its own set of capabilities and constraints. As new technology providers and technologies have emerged and barriers such as quality and size limitations are being overcome, 3D printing has reached an inflection point with practical, viable applications in a wide range of vertical markets, including yours.

Gartner's Forecast: 3D Printers, Worldwide, 2014, written with my co-author Zalak Shah and the assistance of analysts worldwide, is available here. The 42-page report includes forecast data, methodology and assumptions, as well as our analysis of what the forecast signifies for technology providers.



Technology News

ENTER al día: The iPad Air 2 Bends, Ultron, and Harry Potter Returns

25 Oct 2014 03:27:34 Z

Welcome to ENTER al día; these are the day's most important stories in the world of technology and digital culture.

The Day the Nokia Brand Stopped Being a Smartphone Brand

25 Oct 2014 03:27:34 Z

Ever since Microsoft's purchase of Nokia, we had been waiting for this moment: the day the Nokia brand would stop appearing on the Lumia line. The Redmond giant recently changed the branding to Microsoft Mobile on Nokia's page. However, through an official company post on the […]

See the First Images from the "Spawn" Animated Series

25 Oct 2014 03:27:34 Z

The character that propelled publisher Image Comics in the 1990s will get another chance outside the pages of comic books, Comic Book Resources reports. "Spawn" was a creation of Todd McFarlane and one of the titles that launched with the publisher's founding. Its graphic style, thanks to artist Greg […]

Will Smith Will Produce a Series Based on "Hitch"

25 Oct 2014 03:27:34 Z

Will Smith is remembered for many characters: Will in "The Fresh Prince of Bel-Air", Agent J in "Men in Black", and "Hitch", in which he played the best adviser in matters of love.

"Six Guns" Brings the Supernatural to Streaming Friday

25 Oct 2014 03:27:34 Z

Just one week before Halloween, a game full of supernatural elements stars in streaming Friday with Gameloft. "Six Guns" is an open-world, third-person action game in which the player takes control of a man ready to face the Wild West and […]

Watch a Google Executive Break Felix Baumgartner's Record

25 Oct 2014 03:27:34 Z

Without much fanfare, Dr. Alan Eustace broke the record Felix Baumgartner set in 2012. Eustace, a Google executive, rose in a helium balloon to the stratosphere and let himself fall. The systems engineer, currently Google's vice president of search, jumped from an altitude of 41.4 km in […]

The Giant Amazon's Knees Are Starting to Ache

25 Oct 2014 03:27:34 Z

Amazon, one of the world's leading e-commerce companies, published its financial results for the third quarter of 2014. The company managed to grow sales 20% over the same period of 2013; in Q3 2013 Amazon billed $17.09 billion, while in Q3 2014 it took in $20.85 billion. However, […]

ETB Launches Its Oracle Database Hosting

25 Oct 2014 03:27:34 Z

By: Jairo Andrés Ladino* On September 5, ETB launched a new and innovative service that lets companies host (outsource) their Oracle databases. This new product is called database hosting (or DBaaS, database as a service) and features virtualization and tenant isolation on a […]

The New "Harry Potter" Story Will Star Dolores Umbridge

25 Oct 2014 03:27:34 Z

Although J.K. Rowling wants to keep exploring other areas of literature with new novels, the author of the popular Harry Potter saga does not plan to step away from the magical world entirely. TIME reports that the writer will publish a new short story set in the Harry Potter world for Halloween.

Beap Pets, a Device That Helps You Locate Your Pet

25 Oct 2014 03:27:34 Z

Pet owners know how important it is to stay alert when taking them out for a walk, or the trust involved in leaving them with someone else to walk and being sure they will come back home. Moreover, many have had to go through the terrible and […]