New York's Javits Center is a cavernous triumph of form over function. Giant empty spaces were everywhere at this year's empty-though-sold-out Strata/Hadoop World, but the strangely numbered, hard-to-find, typically undersized rooms were packed. Some redesign will be needed next year, because the event was huge in impact and demand will only grow. A few of the big tent pavilions you see at Oracle OpenWorld or Dreamforce would drop into the giant halls without a trace; I'd expect to see some next year to make some usable space available.
So much happened that I'll post a couple of pieces here. Last year's news was all about promises: Hadoop 2.0 brought the promise of YARN enabling new kinds of processing, and there was promise in the multiple emerging SQL-on-HDFS plays. The Hadoop community was clearly ready to crown a new hype king for 2014.
This year, all that noise had jumped the Spark.
If you have not kept up, Apache Spark bids to supplement, and perhaps ultimately replace, MapReduce with a more general-purpose engine, combining interactive processing and streaming along with MapReduce-like batch capabilities, and leveraging YARN to enable a new, much broader set of use cases. (See Nick Heudecker's blog for a recent assessment.) It has a commercializer in Databricks, which has shown great skill in assembling an ecosystem of support from a set of partners who are enabling it to work with multiple key Hadoop stack projects at an accelerating pace. That momentum was reflected in the rash of announcements at Hadoop World, across categories from Analytics to Wrangling (couldn't come up with a Z). There were more than I'll list here; their vendors are welcome to add themselves via comments, and I'll curate this post for a while to put them in.
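For readers who have not worked with either engine, the batch model the two share is easy to sketch in plain Python. This is a toy illustration only - no Hadoop or Spark involved, and every name below is invented for the sketch. Spark's edge for interactive and iterative work comes largely from holding intermediate results, like the shuffled groups here, in memory across stages, where MapReduce writes them to disk between jobs:

```python
# Toy word count in the map/shuffle/reduce style that both MapReduce and
# Spark's batch mode express. Plain Python; illustrative names only.
from collections import defaultdict

def map_phase(lines):
    # Emit (word, 1) pairs, as a MapReduce mapper would
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Group values by key - the framework normally does this between phases,
    # spilling to disk (MapReduce) or keeping it in memory (Spark)
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["Spark supplements MapReduce", "Spark adds streaming"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["spark"])  # 2
```

Real engines add partitioning, fault tolerance and spill-to-disk, but these three phases are the heart of the model; avoiding the disk round-trip between them is what makes iterative workloads so much faster on Spark.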
Hadoop analytics pioneer Platfora announced its version 4.0 with enhanced visualizations, geo-analytics capabilities and collaboration features, and revealed it has "plans for integration" with Spark.
Tableau was a little more ready, delivering a beta version of its Spark Connector, claiming its in-memory offering delivered up to 100x the performance of Hadoop MapReduce. Tableau is also broadening its ecosystem reach, adding a beta version of its connector for Amazon EMR, and support for IBM BigSQL and MarkLogic.
Tresata extended the analytics wave to analytic applications, enhancing its customer intelligence management software for financial data by adding real-time execution of analytical processes using Spark. Tresata is an early mover, and believes one of its core advantages derives from having been architected to run entirely in Hadoop early on. It supports its own data wrangling with Automated Data Ontology Discovery and entity resolution ? cleaning, de-duping, and parsing data.
(For developers, Tresata is also open-sourcing Scalding-on-Spark, a library that adds support for Cascading Taps, Scalding Sources and the Scalding Fields API in Spark.)
Appliances were represented by Dell, which introduced a new in-memory box (one of many Hadoop appliances that represented another 2014 trend) that integrates Spark with Cloudera Enterprise. (Dell is all in on the new datastores; it has built architectures with DataStax for Cassandra, and with MongoDB, as well.)
Cloud was brought to the party by BlueData, packaging Spark with its EPIC private-cloud deployment platform. Standalone Spark clusters can run Scala, MLlib or Spark SQL jobs against data stored in HDFS, NFS and other storage. Note "standalone": Spark can, and will, be used by shops that are not running Hadoop. Once it is actually running production jobs, that is.
Rackspace is in both games with OnMetal, an appliance-based cloud you don't have to own, with a high-performance design using 3.2 TB per data node. Rackspace provisions the other services, and is partnering with Hortonworks to deliver HDP 2.1 or, you guessed it, Spark. This is all built on a thin virtualization layer atop another emerging hot platform: OpenStack.
The distributions were represented, of course. Cloudera jumped in back in February, accompanied by strong statements from Mike Olson that helped put Spark on the map. Hortonworks followed in May with a tech preview. It is still in preview; Hortonworks, for good reasons, is not quite prepared to call it production-ready yet. Pivotal support was announced in May - oddly, in the Databricks blog, reflecting Pivotal's on-again, off-again marketing motions. In New York, MapR, on the bandwagon since April as well, announced that Drill, itself barely out of the gate, will also run on Spark.
It was intriguing to note that many of the emerging data wrangling/munging/harmonizing/preparing/curating players started early. ClearStory CEO Sharmila Mulligan was quick to note during her keynote appearance that her offering has been built on Spark from the outset. Paxata, another of the new players, with a couple of dozen licensed customers already, has also built its in-memory, columnar, parallel enterprise platform on top of Apache Spark. It connects directly to HDFS, RDBMSs, and web services like Salesforce.com, and publishes to Apache Hive or Cloudera Impala. Trifacta, already onto its v2, has now officially named its language Wrangle, added native support for more complex data formats, including JSON, Avro, ORC and Parquet, and, yes, is focusing on delivering scale for its data transformation through native use of both Spark and MapReduce.
Even the conference organizers got into the act. O'Reilly has made a big investment with Cloudera to make Strata a leading conference. It has added a European conference, making Doug Cutting the new conference chair. In New York, O'Reilly announced a partnership with Databricks for Spark developer certification, expanding the franchise before someone else jumps in.
There is far more to come from Spark: a memory-centric file system called Tachyon that will add new capabilities above today's disk-oriented ones; the MLlib machine learning library that will leverage Spark's superior iterative performance; GraphX, for the long-awaited graph performance that today is best served by commercial vendors like Teradata Aster; and of course, Spark Streaming. But much of that is simply not demonstrably production-ready just yet; much is still in beta. Or even alpha. We'll be watching. For now, Spark is the new hype king.
Last year, my colleague Andrew Frank and I introduced the Intelligent Brand Framework as something of a reaction to the overstated promise of big data. Our premise was that data-driven intelligence, while crucial, was far from everything, and that marketers who mistook it for some sort of silver bullet were bound to find themselves on the wrong side of the more human, grey-shaded realities of business.
Yesterday, in HBR, Michael Schrage issued a warning to data-driven retailers based on the lessons of Tesco's very public fall from grace. What caught my eye, of course, was the fact that Tesco, once the very paragon of data-driven insight and multichannel digital innovation, had fallen, and fallen so hard.
Why? Schrage hypothesizes:
"A harsher interpretation is that, despite its depth of data and experience, today's Tesco simply lacks the innovation and insight chops to craft promotions, campaigns and offers that allow it to even preserve share, let alone grow it. What a damning indictment of Tesco's people, processes and customer programs that would be. In less than a decade, the driver and determinant of Tesco's success has devolved into an analytic albatross. Knowledge goes from power to impotence."
Call it the big data blind spot.
It's only a theory, of course. But it's a reasonable hypothesis that Tesco simply put too many eggs in its data-driven basket and consequently foreclosed some of its other areas of potential intelligence. Maybe Tesco indulged in the magical thinking of big data miracles.
To use the language of the Intelligent Brand Framework, Tesco may have overinvested in its power center (appropriately, the left side of the quadrant) and failed to cultivate an adequate flex zone.
I won't play armchair analyst on the specific causes of Tesco's fall, for these sorts of dramatic reversals often involve many actors and many circumstances working in concert toward a catastrophic chain of events that reveals itself through weeks, months and years of forensic scrutiny.
But I will make the more general assertion that the big data blind spot is more than just a snappy turn of phrase. I believe an overreliance on data-driven insight can create the sort of false comfort and complacency that causes brands to lose sight of what they stand for and what customers really care about.
What do you think?
I have been writing extensively on the issues around multichannel pricing for a few years now. As we head into this holiday season, there is a clear expectation that price competition will heat up, and some key differentiators will separate the winners from the losers. As you face these new competitive pressures driven by consumerization, don't lose focus on what is important. Here is how to get a treat:
The displayed price of items must be competitively composed according to the retailer?s brand proposition and offered consistently across all channels.
Customers will give you a trick by shopping your competition if you don't deliver on the basics.
We’re well past the Year of Mobile and into the New Society: the one where people and devices are never really parted and mobile marketing analytics is just marketing analytics with an extra “m” for, well, “mojo.”
You spend more than two hours a day staring at your phone. To be more specific, you spent about 141 minutes a day using your mobile device in 2013, up from just 20 minutes five years earlier. That's 121 extra minutes (2 hours, people) you somehow discovered unencumbered or stole from daydreams, reading, and doing one thing at a time. Most of that time is spent gaming and drifting through Facebook, of course.
But not all. Holiday shopping on smartphones and tablets is projected to skyrocket 53%. Time spent looking at app ads is up 21%. Meanwhile, brands from Walgreens to Toyota are busily mining mobile data for insights into refill patterns and finding people in the market for a new ride. And it’s no wonder mobile analytics startups are on the hunt for data scientists.
Since marketing money follows people, it’s also not surprising to see mobile ad spend rising. Facebook crossed the 50% line this year, and in fact most of its revenue growth is coming from mobile ads. (Twitter is 81% mobile ad-fueled.) Which is great for a handful of platforms but does not describe a healthy ecosystem. It’s estimated that about half of approximately $12 billion in mobile ad spend this year will flow to a single company. (One guess who: Google.) Virtually all mobile search ad spend goes to Google. Any market where 80% of the value is owned by a handful of players (including Pandora and Apple) is not broad-shouldered.
Mobile marketing and advertising is a Version 1.5 discipline. This is not an insult, simply a description of maturity. It’s only been around five years (unless you’re Maclarening back to the SMS and beeper days). CPMs for mobile ads are still relatively low, quality inventory is thin on the ground, and there’s no real consensus on what a “mobile banner ad” should look like when it grows up.
Adding confusion to the chaos, mobile analytics is significantly harder than big browser analytics. (Here are ten reasons why.) Mobile and app analytics platforms proliferate in a way that resembles Web analytics back in, say, 2002. There are the big boys like Adobe and IBM, newer analytics solutions like Mixpanel, advertiser-focused players like Yahoo’s Flurry, site experience tools like Artisan, and location-targeting specialists like Placed. And the list goes on . . . and on.
Part of the problem is what we call the cross-device conundrum. Simply put, people are harder to track on their mobile devices. Apple’s iOS deprecates cookies by default. Both Apple and a consortium led by Google provide cookie-like identifiers to advertisers (called IDFA and AdID), but these are incompatible and limited in use. There are also statistical methods of tying people to devices, with varying accuracy, but targeting people both for marketing efficiency and deeper insights remains a lingering challenge.
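To make the cross-device conundrum concrete, here is a toy sketch of the kind of statistical matching mentioned above. This is pure Python for illustration only: the network names, time buckets, threshold and scoring function are all invented for this sketch and do not reflect any vendor's actual model.

```python
# Toy probabilistic device matching: two devices that repeatedly appear on
# the same networks at the same times are likely the same person.

def jaccard(a, b):
    # Overlap of two sets of observations: 0.0 (disjoint) to 1.0 (identical)
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Each observation is a (network, hour-of-day) pair seen for that device
phone  = [("home-wifi", 22), ("office-wifi", 10), ("home-wifi", 7)]
tablet = [("home-wifi", 22), ("home-wifi", 7), ("cafe-wifi", 13)]
laptop = [("hotel-wifi", 9), ("airport-wifi", 15)]

MATCH_THRESHOLD = 0.4  # arbitrary cutoff for this sketch

print(jaccard(phone, tablet) > MATCH_THRESHOLD)  # True: likely the same user
print(jaccard(phone, laptop) > MATCH_THRESHOLD)  # False
```

Production systems weigh far richer signals and calibrate thresholds against known matches; the point of the sketch is that, without a shared identifier, cross-device identity becomes a probability rather than a cookie.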
This week, our Gartner for Marketing Leaders Editorial Calendar focuses on helping digital marketers to address this challenge. We provide an indispensable Market Guide to Mobile Marketing Analytics (subscription required) as well as other research to help you mobilize this key part of your analytics discipline.
Last blog out, we visited the fairytale castles of Hohenschwangau and Neuschwanstein. Cue a tenuous link to talk about someone who might live in such a place, the Data Fairy.
Full disclosure and credit where credit is due: I was introduced to the Data Fairy by former client and now good friend Liz Locksley. Liz has a fantastic background in business strategy, process change and data-driven decision-making, and is now active in sustainable living, energy efficiency and green technology. If you ever get the chance to work with her, take it. Liz is ace.
Anyway, back to the Data Fairy. Liz suggests that typical dialogue might go like this:
End User: "I want a database."
Liz: "Who's going to look after that, then?"
End User: "Well, I thought you might…"
Liz: "Do I look like the Data Fairy?!"
Somewhat akin to her cousins the Tooth Fairy and the Fairy Godmother, the Data Fairy is elusive, enigmatic and only really appears at times of great need. But if you're lucky enough to find one, you can depend upon them to help you fulfill your heart's desires or resolve some intractable problem. When the Data Fairy comes, you just make your wish for data, they wave their data wand, and data magic happens. Setting up a new data system, cleansing a low-quality data set, preparing a report that no one else could deliver: these are all in a day's work for the Data Fairy.
But remember, in the world of fairy tales, magic always comes at a cost. Rapunzel's prince loses his eyes. Cinderella's step-sisters cut off parts of their feet and have their eyes pecked out by birds. Pinocchio is hanged for his faults.
The Data Fairy's enchantments are no different. With great power comes great responsibility, and relying on the Data Fairy to save the day is never the end of the story. The data magic only lasts as long as the Data Fairy's spell; at some point, they'll be gone, and things will return to normal, probably with unintended consequences. Heroes and heroines must ultimately stand up for themselves and find their answers deep within, and the happy ending comes only after hardships endured, adversity bravely faced and sorrowful lessons learned.
Do you have a Data Fairy in your organization? If so, treasure them. Just be careful what you wish for, and enjoy their magic while it lasts.
And if you are a Data Fairy, please get in touch and share your magical stories.
I was at IBM's Insights event in Las Vegas this week for a couple of days. Here are some of my perspectives on the event; I also compare and contrast it with what I observed while attending SAP's TechEd the week before.
Bottom line: IBM has stuck to its "big data and analytics" message for the last two or three years. In some ways this was a tad repetitive, or so it seemed for a while; the main keynote had that feeling of sameness about it. However, a new message emerged as day 1 progressed: the offerings talked about in previous years are now here, available and ready for use. In other words, IBM's big data and analytics solutions are real, and some of its vision has been realized. I did not see much new innovation, however.
Now for the details.
Client examples were sprinkled through the various keynotes, and customer stories always help convey the vendor messages. I particularly liked the Pratt and Whitney (Internet of Things evolving predictive maintenance) and Ceva Logistics (evolving use and exploitation of supply chain asset oriented analytics) stories. The key tips from Ceva were “do taxonomy first, and do information retention policy early”. Sage advice, for sure.
Another new emphasis on an old topic concerned IBM Watson. Throughout the first day there were several new product announcements, and several played on the commercial availability of IBM Watson-powered products. These represent the leading, innovative examples that were only loosely described in previous years.
Like SAP the week before, IBM has had a silver bullet looking for a problem to solve. It seems IBM has done a little more homework than SAP in this area, in my view: IBM's Watson seems to have come of age while SAP's HANA is still in high school. Several commercial solutions were demoed, including Chef Watson, targeted at you and me and our creative culinary juices, and an IBM Watson solution specific to healthcare (oncology). IBM emphasized how these were packaged and targeted at specific roles and use cases. Packaged solutions are what Swiss Army knives such as IBM Watson and SAP HANA need to be meaningful to business users.
As I listened to the speakers through day 1, several cool metaphors or phrases resonated with me. One was a phrase used to define an industry leader: "two moves for one." That sounded cool. The point is that in the time and money it takes your competitor to make a single move, as if on a chessboard, your organization could execute two. This is a powerful message, and one far bigger and broader than mere analytics. It conveys the fan-favorite "plan, do, check, act" (a slight nod to i2 Technologies' Sanjiv Sidhu).
The second cool idea was teased by Beth Smith, GM of Information Management, in her division's keynote. She got my heart racing with "liquid data layer," then promptly let me down by not following up and exploring the idea. She mentioned it one more time, as a near afterthought, and that was it. I thought I was going to be made weak at the knees with a pitch about semantic governance or something. Bummer.
One odd thought came to me that had a sports flavor, and it applies to both IBM and SAP. The question was: why does big data produce so many one-time data innovations?
Another odd thought came to mind during the ECM keynote: over 90 percent of what Doug Hunt (GM, ECM) referred to actually related to structured data, not unstructured data. In truth, he was referring to examples oriented toward content, but content with some contextual structure applied. Once structured, content becomes information, even if it is not physically stored in a relational database. Such information should be subject to the exact same information governance policies and practices as master data. This ties into a recent blog of mine (see "Is it too soon for unstructured data governance?") on the issue.
Open questions, considerations and queries
In the main keynote, IBM overemphasized analytics, as if analytics alone would save or improve your business. There was very little talk about information management per se; there was talk of faster boxes, servers and stuff, but that's the 'easy' part to talk about. Governance, quality and trust in data, and information value, were not the priority for this event's headlines.
Additionally, IBM had a key slide that it failed to explore, explain or exploit. The slide emphasized how insight from analytics leads to action and a change in (business) outcome. IBM did not explore that change in action enough for me. It's as if the focus was on the stop lights, and it was assumed that you, the driver, know how to drive. There was good use of case management, but again, that is a poor substitute for all the business apps that are the vehicle controls we all use every day.
From queries to wrinkles
A wrinkle for me: "markets of one are a stopover." That did not resonate with me, but it was said several times in the keynotes. Markets of one is a concept that should sit at the heart of a new digital business strategy. IBM was trying to outdo the concept with the suggestion that a market of one does not imply the ability to interact with and even influence the market. But that is only true because IBM says it is; no one ever said a market of one was meant to exclude interaction or influence. IBM had good content, but the message chosen to convey it didn't work for me.
One final wrinkle and thought left me wanting more. After the conclusion of the divisional keynotes (corporate, business leaders, information management, ECM, and business analytics), I was left wondering: why didn't IBM call out the market discontinuity by not having separate keynotes for information management, ECM, and analytics? All the keynotes included a major focus on analytics, and though the opening keynote was a kind of overview, it too was heavily focused on analytics. I'd like to see a vendor break the mold, avoid siloed keynotes, and focus on a matrixed message and structure.
So wrapping up…
I tweeted quite early on day 1 that, for me, "taxonomy was tops." What I meant was that the new world IBM is talking about really points to the validity, quality and meaningful use of my taxonomy versus your taxonomy. At the end of the day, whether we are using content or information, dark or big data, streaming or in-memory data, if we can't express the semantic meaning of the data, we are done. IBM did not really explore or explain how this brave new data world will unfold. I know IBM has cool stuff going on and new products rolling out. But I didn't see IBM with a visible, rounded, public road map showing how all the necessary tools converge to align, prioritize, then operationally govern key information artifacts across an organization, or beyond its firewall. We all know IBM can build almost anything (it has a big bag of tools and products), but it does not yet seem to want to push this particular envelope. I accept that there are few, if any, real buyers for such a vision, but I think it is on our collective horizon.
HP’s risky plan takes two years for its first 3D printer to enter the market.
HP introduced its new Multi Jet Fusion 3D printing technology and its Sprout “blended reality” devices at a launch event in NYC earlier today.
Multi Jet Fusion 3D printing technology is derived from a combination of existing inkjet and fusion processes and will be developed with a range of customer and vendor partners. HP aims to introduce the first printers in the second half of 2016. Pricing is not available, but the printers will be targeted at enterprises and service bureaus, not consumers.
The Multi Jet Fusion technology uses thermoplastics, jetted materials and radiant energy. The technology resembles a binder inkjet process, but the inkjet head deposits a thermal fusion liquid. A heating element transfers energy into the fusing agent, causing localized fusion of the thermoplastic. The printhead also deposits a second material layer, promoting fine detail and surface finish.
Courtesy of HP
Other materials and chemicals can be deposited with the fluids, enabling the infusion of metals, ceramics and chemicals into the build material.
HP recently demonstrated for my colleague Martin Reynolds and me precision parts with good mechanical properties and a reasonably smooth finish that were printed using optimized thermoplastics.
The results are claimed to be mechanically strong and highly precise, similar to thermoplastics formed by selective laser sintering (SLS) processes. HP indicates production rates are expected to be about 10 times faster than SLS. However, it is too early to know whether such high productivity applies to all use cases or if it is peak performance for certain situations, materials and/or products.
Multi Jet Fusion technology draws from HP's expertise in inkjet hardware and chemistry for the 2D digital printer market. Based on its initial market research, the first material will be a black thermoplastic, but the development of colored materials is already well underway. The materials and chemicals reportedly will not require special handling for safety purposes.
As with other 3D printing technologies, potential uses of Multi Jet Fusion include new product development, prototyping and functional testing. Manufacturing, mass customization and the creation of alternative product designs without limitations of traditional manufacturing techniques will also be possible.
HP has the first product in development, but does not expect its 3D printers to reach the market for two years. While some commentators were expecting a product launch now or by the spring of 2015, HP feels that engaging in partnerships with customers and vendors now will help it refine the technology and determine the best initial applications.
Possible Printer Design by HP
HP's partner-based approach provides opportunities for third parties to collaborate on and develop the printers. The idea of encouraging engagement and innovation from outside of HP may seem like a very non-native thing for HP to do, but the concept was successfully applied to the development of its very-high-speed inkjet press product line.
The material range, part quality and speed possible with Multi Jet Fusion may disrupt the market if the first printers launch as planned. Yet rivals have plenty of time to innovate, making HP's strategy risky. HP must ensure the printers' material range, capabilities, productivity and pricing are not just significantly better than today's technology, but better than the technologies and 3D printer offerings that will be in the market in late 2016.
The Road to Enduring SFA Adoption Success Part 4
For years, I've told clients that achieving strong SFA value necessarily comes in four stages. From recent talks with clients about their SFA adoption and SFA strategic plans, I think it's time to change the model. Below is my traditional model for achieving SFA success. It shows a gradual increase in business impact over time, starting with managerial effectiveness.
The first three stages are correct and appropriate, but the value propositions are in the wrong order. Managerial effectiveness should be third, not first. And individual utilization and effectiveness needs to shift lower, to the Initial Usage stage.
I’m advocating for the new model because the reasons for the initial SFA investment — better pipeline visibility, higher forecast accuracy, better call tracking, etc.– simply aren’t enough to compel our customer-facing resources to use the SFA.
A colleague recently told me about an Inquiry that seems to be pretty common:
Executive Sponsor: “We have really poor adoption of our SFA. Reps don’t want to use the system. What could be causing it?”
Gartner: “Do you have any capabalities in the system that are meaningful to the reps?”
Executive Sponsor: ” Well, no not really….”
Executive Sponsor: “We have really poor adoption of our SFA. Reps don’t want to use the system. What could be causing it?”
Gartner: “Do you have any capabalities in the system that are meaningful to the reps?”
Executive Sponsor: ” Well, no not really….”
We have to give customer-facing resources tools that make it easier for them to sell. They need deeply capable applications that combine functionality with process in a way that matches how they already work. And they need accurate, timely data, so they don't have to work in multiple systems and don't have to guess which system holds the best source of truth. Only with these steps will the SFA be meaningful to how they work.
When that occurs, SFA usage and data accuracy increases. And when that milestone is reached, managers finally get the insight into sales performance that they need to be effective.
And forgive me if it seems like I am a late convert to this idea. For example, Frederick Newell wrote about this as early as 2003 in “Why CRM Doesn’t Work.”
Stay tuned for Part 5 of The Road to Enduring SFA Adoption Success, when I detail more about how to change the sequence of this value curve.
You're probably thinking, "Wow, that is a bold statement," and you would be right in thinking so. However, that doesn't make it any less true. Over the past few months I've been working with Gartner IoT experts and listening to and observing customers around the world in their usage of IoT technologies. One thing is clear: those that treat IoT as more than just a hobby or side project are achieving substantial gains. As we know, the only way to do that is to create a meaningful strategy that puts company value at the center.
This week I released two pieces of research that are directly tailored to the role of the enterprise architect. What you'll find different in this research is that I go through IoT focused on the broad strategic impacts rather than a narrow individual solution, so the language, methods and tools that come right out of the enterprise architecture toolbox are important. The goal of this research is to provide essential information: it doesn't necessarily go deep into the IoT space for educational purposes, but rather provides the constructs.
The first piece of research, "Leveraging Enterprise Architecture to Enable Business Value With IoT Innovations Today," is about what enterprise architects can do today to leverage the Internet of Things. As I said before, I focused less on the actual technology; that's the easy part. It's about what you do with it, how you architect it to maximize value, and how you reduce the risks that adopting this technology poses to both your internal operations and your customers. You'll find key positions that will guide you through what you can do as an enterprise architect, not five years from now but today. This research is all about teaching you how to fish rather than catching fish for you. That is important because every situation is unique and requires customization based on the unique business outcome you wish to achieve.
The second piece of research, "Toolkit: What Enterprise Architects Need to Know About IoT Technologies," is an action-oriented deliverable in a toolkit format. I provide a PowerPoint presentation that gives enterprise architects the diagnostic deliverables that relate to the Internet of Things. What is unique about this Toolkit is that it isn't a bunch of random deliverables; they are put together in a cohesive manner and applied to real-world scenarios. It includes:
So what you'll find is not only a set of great guidance for enterprise architects but also the deliverables they need to be successful. Keep in mind these are not an exhaustive list and do not represent all situations. However, these examples should give food for thought and inspiration into how to think about this pervasive technology.
I hope you enjoy it. If you have any comments, questions or general feedback, I would love to hear it in the comments below.
Unlike with an on-premises SIEM, or even a still-mostly-mythical SaaS/cloud SIEM, with an MSSP contract you are paying for people and not just for the tools. This obvious fact - that the "S" in MSSP stands for "services," and service implies people - somehow escapes some organizations. Let's explore this a bit here. If you pick an MSSP partner with an amazing technology platform and unskilled, frequently churning, lazy, perversely motivated (tickets closed per hour, anybody?) personnel with questionable ethics and a lack of proficiency in your language of choice, do you think your security monitoring capability will…?
I think you get the idea. Now, some of you may, in good faith, choose option 3). Frankly, I was thinking of coming up with some joke about it, but became sad instead…
A wise CSO once told me that in order to outsource a security process (such as security monitoring or device management) and achieve a great result, you have to know precisely what a great process of that kind looks like. Indeed, how would you know that your MSSP runs a great SOC if you have never even seen one? The same applies to people. If you have never hired and managed great security analysts, how would you know that your MSSP partner actually employs them? Sure, when you buy products you can rely on our research, the views of your peers, or whatever other factors, but such methods are much harder to apply to the people and process aspects of your future MSSP relationship. So, I am sorry to break the news here, but thinking is involved!
One quality MSSP provider told me that his favorite MSSP client is one who knows exactly what an excellent security operations capability looks like (such as from a previous job), but also knows that he cannot build one himself (no chance to hire, needs it faster than he can grow it, etc.). This makes perfect sense: it is easier to conceptualize and understand a mature security monitoring operation than to actually make one materialize in your organization. Thus, if you know what one looks like, you may be able to get it from your MSSP partner.
But back to people. In essence, you need to spend time learning:
a) what a great security analyst looks like;
b) whether your chosen MSSP partner employs them; and
c) whether they will be assigned to your account.
Otherwise, that MSSP may be cheap rather than cost-effective. You want economies of scale in monitoring, not cheap crap in monitoring. And it is also your responsibility to understand the difference! So, learn about security skill sets and relevant certifications, then find out whether the MSSP has them and whether its people have real experience fighting threats [and winning, at least occasionally :-)], and then keep checking that this remains true as your relationship continues…
Finally, how was your experience with MSSP personnel?
Blog posts related to this research on MSSP usage:
October 31 isn't just Halloween for most of the world; since 1924 it has also been World Savings Day. For the occasion, Samsung is presenting the energy- and water-saving benefits its products offer in the search for solutions for the home. To help conserve resources, […]
While laptops keep getting thinner (with keyboards that follow suit), several corners of the gamer world have gone back to old mechanical keyboards. As PC World notes, mechanical keyboards have gained popularity in recent years among players who are glued to their computers. […]
Halloween has many faces, but among them one has become especially popular in recent years: zombies. The living dead have become icons of film, television, comics and, of course, video games. Among the mobile offerings is a title that combines action, zombies, […]
Today is Halloween, and with the date comes the chance to enjoy countless specials, movies and themed events dedicated to the spookiest night of the year. That's why we want to show you what are, for us, the ten best 'Treehouse of Horror' episodes of 'The Simpsons'. Note that this selection was made according to personal taste, […]
Does anyone remember the Amazon Fire Phone? In June of this year, the company behind the Kindle tablet and all its derivatives presented its Fire Phone smartphone to the world, though focused on the US market. However, after a few months on the market, the device has generated only losses and expenses for […]
The Spanish congress approved a revamped version of the Intellectual Property Law, which will take effect in its territory starting in January 2015. It creates new mechanisms to protect content on the internet and establishes measures to combat its theft, such as the controversial 'Google Tax'. […]
One of the recommended characters for getting started in 'Smite', the MOBA from Level Up!, is the Egyptian sun god Ra. The character is a mage class, and his ranged attacks let him stay away from danger on the battlefield. With moves like 'Celestial Beam' and 'Searing Pain', Ra can perform attacks that affect […]
Shigeru Miyamoto, Nintendo's video game luminary, made his film debut at the recent Tokyo film festival, as The Hollywood Reporter reports. The 61-year-old genius behind 'Donkey Kong', 'Mario' and 'TLoZ' had been eager to enter the world of animation and film since last year, […]
Next week, Facebook founder and CEO Mark Zuckerberg will hold a question-and-answer session, as he announced himself on a new page on the social network called Q&A with Mark, which already has more than 50,000 followers. […]
Since last year's Tokyo Game Show, we have believed that 'Final Fantasy XV' will be the installment that unlocks the true graphical potential of this generation's consoles. Much has been said about this action RPG and about how Square Enix plans to tell the road-trip story it showed during the most recent E3. […]