Big Data: Lessons from the 2012 election


The US Presidential elections of 2012 confounded many pundits. The Republican candidate, Gov. Mitt Romney, put together a strong campaign, and polls leading into the final week suggested a close race. The final results were not so close, and Barack Obama handily won a second term.

Antony Young explains how the Obama campaign used big data, analytics and micro-targeting to mobilize key voter blocs, giving Obama the numbers needed to push him over the edge.

“The Obama camp, in preparing for this election, established a huge Analytics group that comprised behavioral scientists, data technologists and mathematicians. They worked tirelessly to gather and interpret data to inform every part of the campaign. They built up a voter file that included voter history and demographic profiles, but also collected numerous other data points around interests … for example, did they give to charitable organizations, or which magazines did they read, to help them better understand who they were and better identify the group of ‘persuadables’ to target.”

“That data could be drilled down to zip codes, individual households and, in many cases, individuals within those households.”

“However, it is how they deployed this data in activating their campaign that translated the insight they garnered into killer tactics for the Obama campaign.”

“Volunteers canvassing door to door or calling constituents were able to access these profiles via an app on an iPad, iPhone or Android mobile device, providing an instant transcript to help them steer their conversations. They were also able to input new data from their conversations back into the database in real time.”

“The profiles informed their direct and email fundraising efforts. They used issues such as Obama’s support for gay marriage or Romney’s missteps in his portrayal of women to directly target more liberal and professional women on their database with the message that “Obama is for women,” using that opportunity to solicit contributions to his campaign.”

“Marketers need to take heed of how the Obama campaign transformed their marketing approach centered around data. They demonstrated incredible discipline to capture data across multiple sources and then to inform every element of the marketing – direct to consumer, on the ground efforts, unpaid and paid media. Their ability to dissect potential prospects into narrow segments or even at an individual level and develop specific relevant messaging created highly persuasive communications. And finally their approach to tap their committed fans was hugely powerful. The Obama campaign provides a compelling case for companies to build their marketing expertise around big data and micro-targeting. How ready is your organization to do the same?”
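To make the micro-targeting idea concrete, here is a toy sketch of scoring a voter file into “persuadables” and everyone else. The field names, weights and threshold are all invented for illustration; the campaign's actual models were far more sophisticated.

```python
# Hypothetical sketch of segmenting a voter file. All fields and weights
# are invented; real campaign models used far richer data and methods.

def persuadability_score(voter):
    """Combine a few illustrative data points into a single score."""
    score = 0.0
    if voter.get("party") == "independent":
        score += 0.4
    if voter.get("donates_to_charity"):
        score += 0.2
    score += 0.1 * len(voter.get("magazines", []))
    if voter.get("voted_last_election"):
        score += 0.3
    return score

def segment(voters, threshold=0.5):
    """Split a voter file into persuadables and everyone else."""
    persuadable = [v for v in voters if persuadability_score(v) >= threshold]
    other = [v for v in voters if persuadability_score(v) < threshold]
    return persuadable, other

voters = [
    {"name": "A", "party": "independent", "donates_to_charity": True,
     "magazines": ["news"], "voted_last_election": True},
    {"name": "B", "party": "republican", "donates_to_charity": False,
     "magazines": [], "voted_last_election": False},
]
persuadable, other = segment(voters)
print([v["name"] for v in persuadable])  # → ['A']
```

The point is not the particular weights but the workflow: every new data point a volunteer feeds back refines the score, and the message each segment receives.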

Old dogs, new tricks


Doris Lessing, the freewheeling Nobel Prize-winning writer on racism, colonialism, feminism and communism, died November 17 at the age of 94. She was prolific for most of her life, but five years ago she said the writing had dried up. “Don’t imagine you’ll have it forever,” she said, according to one obituary. “Use it while you’ve got it because it’ll go; it’s sliding away like water down a plug hole.”

In the very fast-changing world of IT, it is common to feel like an old fogey. Everything changes at bewildering speed, from hardware specs to programming languages to user interfaces. We hear of wunderkinds whose innovations transform our very culture; think Mozart or Zuckerberg, to name two.

Tara Bahrampour examined the idea, and quotes author Mark Walton, “What’s really interesting from the neuroscience point of view is that we are hard-wired for creativity for as long as we stay at it, as long as nothing bad happens to our brain.”

The field also matters.

Howard Gardner, professor of cognition and education at the Harvard Graduate School of Education says, “Large creative breakthroughs are more likely to occur with younger scientists and mathematicians, and with lyric poets, than with individuals who create longer forms.”

In fields like law, psychoanalysis and perhaps history and philosophy, on the other hand, “you need a much longer lead time, and so your best work is likely to occur in the latter years. You should start when you are young, but there is no reason whatsoever to assume that you will stop being creative just because you have grey hair,” Gardner said.

Old dogs take heart; you can learn new tricks as long as you stay open to new ideas.

Fail How To: Top 3 SIEM implementation mistakes


Over the years, we have had a chance to witness a large number of SIEM implementations, with results ranging from the superb to the colossal failure. What do the failures have in common? This blog post by Keith Strier nails it:

1) Design Democracy: Find all internal stakeholders and grant all of them veto power. The result is inevitably a mediocre mess. The collective wisdom of the masses is not the best thing here. A super empowered individual is usually found at the center of the successful implementation. If multiple stakeholders are involved, this person builds consensus but nobody else has veto power.
2) Ignore the little things: A great implementation is a set of micro-experiences that add up to make the whole. Think of the Apple iPhone: every detail, from the shape, size and appearance to every icon, gesture and feature, converges to enhance the user experience. The path to failure is to focus only on the big picture, ignore the little things from authentication to navigation, and launch just to meet the deadline.

3) Avoid Passion: View the implementation as non-strategic overhead; implement and deploy without passion. The result? At best, requirements are fulfilled, but users are unlikely to be empowered. Milestones may be met, but business sponsors still complain. Prioritizing deadlines, linking IT staff bonuses to delivery metrics and squashing creativity is a sure way to launch technology failures that crush morale.

Monitoring File Permission Changes with the Windows Security Log


Unstructured data access governance is a big compliance concern. Unstructured data is difficult to secure because there is so much of it, it is growing so fast, and it is user-created, so it does not automatically get categorized and controlled like structured data in databases. Moreover, unstructured data is usually a treasure trove of sensitive and confidential information in a format that bad guys can consume and understand without reverse engineering the relationships between tables in a relational database.
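As a concrete illustration, here is a minimal sketch of filtering exported Windows Security Log records for permission changes. Event ID 4670 (“Permissions on an object were changed”) is the relevant event on Server 2008 and later; the record format below is simplified for illustration.

```python
# Sketch: filter exported Security Log records for DACL changes.
# Event ID 4670 = "Permissions on an object were changed" (Server 2008+).
# Record fields are simplified stand-ins for the real event properties.

PERMISSION_CHANGED = 4670

def permission_changes(events, path_prefix=None):
    """Yield events that record a permission change, optionally under one path."""
    for ev in events:
        if ev["EventID"] != PERMISSION_CHANGED:
            continue
        if path_prefix and not ev["ObjectName"].startswith(path_prefix):
            continue
        yield ev

events = [
    {"EventID": 4670, "ObjectName": r"C:\Shares\HR\salaries.xlsx",
     "SubjectUserName": "jdoe"},
    {"EventID": 4624, "ObjectName": "", "SubjectUserName": "jdoe"},  # a logon, ignored
]
for ev in permission_changes(events, path_prefix=r"C:\Shares\HR"):
    print(ev["SubjectUserName"], "changed permissions on", ev["ObjectName"])
```

In practice you would feed this from an event export or a log collector rather than a hand-built list, and alert when sensitive paths show up.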

Digital detox: Learning from Luke Skywalker


For any working professional in 2013, multiple screens, devices and apps are integral instruments for success. The multitasking can be overwhelming, and dependence on gadgets and Internet connectivity can become a full-blown addiction.

There are digital detox facilities for those whose careers and relationships have been ruined by extreme gadget use. Shambhalah Ranch in Northern California has a three-day retreat for people who feel addicted to their gadgets. For 72 hours, the participants eat vegan food, practice yoga, swim in a nearby creek, take long walks in the woods, and keep a journal about being offline. Participants have one thing in common: they’re driven to distraction by the Internet.

Is this you? Checking e-mail in the bathroom and sleeping with your cell phone by your bed are now considered normal. According to the Pew Research Center, in 2007 only 58 percent of people used their phones to text; last year it was 80 percent. More than half of all cell phone users have smartphones, giving them Internet access all the time. As a result, the number of hours Americans spend collectively online has almost doubled since 2010, according to ComScore, a digital analytics company.

Teens and twentysomethings are the most wired. In 2011, Diana Rehling and Wendy Bjorklund, communications professors at St. Cloud State University in Minnesota, surveyed their undergraduates and found that the average college student checks Facebook 20 times an hour.

So what can Luke Skywalker teach you? Shane O’Neill says it well:

“The climactic Death Star battle scene is the centerpiece of the movie’s nature vs. technology motif, a reminder to today’s viewers about the perils of relying too much on gadgets and not enough on human intuition. You’ll recall that Luke and his team of X-Wing fighters are attacking Darth Vader’s planet-size command center. Pilots are relying on a navigation and targeting system displayed through a small screen (using gloriously outdated computer graphics) to try to drop torpedoes into the belly of the Death Star. No pilot has succeeded, and a few have been blown to bits.

“Luke, an apprentice still learning the ways of The Force from the wise — but now dead — Obi-Wan Kenobi, decides to put The Force to work in the heat of battle. He pushes the navigation screen away from his face, shuts off his “targeting computer” and lets The Force guide his mind and his jet’s torpedo to the precise target.

“Luke put down his gadget, blocked out the noise and found a quiet place of Zen-like focus. George Lucas was making an anti-technology statement 36 years ago that resonates today. The overarching message of Star Wars is to use technology for good. Use it to conquer evil, but don’t let it override your own human Force. Don’t let technology replace you.

Take a lesson from a great Jedi warrior. Push the screen away from time to time and give your mind and personality a chance to shine. When it’s time to use the screen again, use it for good.”

Looking back: Operation Buckshot Yankee & agent.btz


It was the fall of 2008. A variant of a three-year-old, relatively benign worm began infecting U.S. military networks via thumb drives.

Writing nearly two years later, Deputy Defense Secretary William Lynn traced patient zero to an infected flash drive inserted into a U.S. military laptop at a base in the Middle East. The flash drive’s malicious computer code uploaded itself onto a network run by the U.S. Central Command. That code spread undetected on both classified and unclassified systems, establishing what amounted to a digital beachhead from which data could be transferred to servers under foreign control. It was a network administrator’s worst fear: a rogue program operating silently, poised to deliver operational plans into the hands of an unknown adversary.

The worm, dubbed agent.btz, caused the military’s network administrators major headaches. It took the Pentagon nearly 14 months of stop-and-go effort to clean out the worm, a process the military called Operation Buckshot Yankee. The cleanup was so hard that it led to a major reorganization of the information defenses of the armed forces, ultimately leading to the creation of the new Cyber Command.

So what was agent.btz? It was a variant of the SillyFDC worm, which copies itself from removable drive to computer and back to drive again. Depending on how it is configured, the worm can scan computers for data, open backdoors, and send data through those backdoors to a remote command-and-control server.

To keep the worm from spreading across its networks, the Pentagon banned thumb drives and the like from November 2008 to February 2010. Another mitigation is to disable Windows’ “autorun” feature, which instantly starts any program loaded on a drive.
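On the defensive side, here is a small sketch that flags removable drives carrying a top-level autorun.inf, the launch mechanism such worms rely on. The drive enumeration is left out; a real script would list the actual mounted volumes.

```python
# Hedged sketch: flag drives that carry an autorun.inf at the top level,
# the file agent.btz-style worms use to launch from removable media.
import os
import tempfile

def suspicious_autorun(drive_roots):
    """Return the drive roots that carry a top-level autorun.inf."""
    flagged = []
    for root in drive_roots:
        if os.path.isfile(os.path.join(root, "autorun.inf")):
            flagged.append(root)
    return flagged

# Demo with temporary directories standing in for mounted volumes.
infected = tempfile.mkdtemp()
open(os.path.join(infected, "autorun.inf"), "w").close()
clean = tempfile.mkdtemp()
print(suspicious_autorun([infected, clean]) == [infected])  # → True
```

A check like this is no substitute for disabling autorun outright, but it shows how simple the detection of the propagation vector is compared with the 14-month cleanup.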

As Noah Shachtman noted, the havoc caused by agent.btz has little to do with the worm’s complexity or maliciousness — and everything to do with the military’s inability to cope with even a minor threat. “Exactly how much information was grabbed, whether it got out, and who got it — that was all unclear,” says an officer who participated in the operation. “The scary part was how fast it spread, and how hard it was to respond.”

Gen. Kevin Chilton of U.S. Strategic Command said, “I asked simple questions like how many computers do we have on the network in various flavor, what’s their configuration, and I couldn’t get an answer in over a month.” As a result, network defense has become a top-tier issue in the armed forces. “A year ago, cyberspace was not commanders’ business. Cyberspace was the sys-admin guy’s business or someone in your outer office when there’s a problem with machines business,” Chilton noted. “Today, we’ve seen the results of this command level focus, senior level focus.”

What can you learn from Operation Buckshot Yankee?
a) That denial is not a river in Egypt
b) There are well known ways to minimize (but not eliminate) threats
c) It requires command level, senior level focus; this is not a sys-admin business

Defense in Depth – The New York Times Case


In January 2013, the New York Times accused hackers from China, with connections to the Chinese military, of successfully penetrating its network and gaining access to the logins of 53 employees, including Shanghai bureau chief David Barboza, who the previous October had published an embarrassing article on the vast secret wealth of China’s prime minister, Wen Jiabao.

This came to light when AT&T noticed unusual activity that it was unable to trace or deflect. A security firm was brought in to conduct a forensic investigation, which uncovered the true extent of what had been going on.

Over four months starting in September 2012, the attackers managed to install 45 pieces of targeted malware designed to probe for data such as emails after stealing credentials; only one of these was detected by the installed antivirus software from Symantec. Although the staff logins were hashed, that does not appear to have stopped the hackers in this instance, perhaps, the newspaper suggests, because they were able to deploy rainbow tables against the relatively short passwords.
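To see why short, unsalted password hashes fall so easily, consider this toy illustration. Real rainbow tables trade storage for time using hash chains; this exhaustive lookup table makes the same point at toy scale (four lowercase letters, MD5).

```python
# Toy demonstration of precomputation against short, unsalted hashes.
# Every candidate password is hashed once; recovering a stolen hash is
# then a constant-time dictionary lookup. A salt defeats exactly this.
import hashlib
from itertools import product
from string import ascii_lowercase

def build_table(length=4):
    """Map MD5 digest -> plaintext for every lowercase password of `length`."""
    table = {}
    for chars in product(ascii_lowercase, repeat=length):
        pw = "".join(chars)
        table[hashlib.md5(pw.encode()).hexdigest()] = pw
    return table

table = build_table()               # 26**4 = 456,976 entries, built in seconds
stolen_hash = hashlib.md5(b"news").hexdigest()
print(table[stolen_hash])           # → news
```

At four characters the whole space fits in memory; real attackers precompute far larger spaces, which is why length and salting both matter.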

Symantec offered this statement: “Turning on only the signature-based anti-virus components of endpoint solutions alone are not enough in a world that is changing daily from attacks and threats.”

Still think that basic antivirus and a firewall are enough? Take it directly from Symantec: you need to monitor and analyze data from inside the enterprise for evidence of compromise. This is Security Information and Event Management (SIEM).

Cyber Pearl Harbor a myth?


Erik Gartzke, writing in International Security, argues that attackers don’t have much motive to stage a Pearl Harbor-type attack in cyberspace if they aren’t involved in an actual shooting war.

Here is his argument:

It isn’t going to accomplish any very useful goal. Attackers cannot easily use the threat of a cyber attack to blackmail the U.S. (or other states) into doing something they don’t want to do. If they provide enough information to make the threat credible, they instantly make the threat far more difficult to carry out. For example, if an attacker threatens to take down the New York Stock Exchange through a cyber attack, and provides enough information to show that she can indeed carry out this attack, she is also providing enough information for the NYSE and the U.S. Government to stop the attack.

Cyber attacks usually involve hidden vulnerabilities — if you reveal the vulnerability you are attacking, you probably make it possible for your target to patch the vulnerability. Nor does it make sense to carry out a cyber attack on its own, since the damage done by nearly any plausible cyber attack is likely to be temporary.

Points to ponder:

  • Most attacks occur against well-known vulnerabilities, on systems that are unpatched
  • Most attacks go undetected, and systems stay “pwned” for weeks or months
  • The disruption caused when attacks are discovered is significant, in both human and cost terms
  • There was little logic in the 9/11 attacks other than to cause havoc and fear (i.e., terrorists are not famous for logical, well-thought-out reasoning)

As for commercial systems, attacks are usually for monetary gain. Attacks are also often performed simply because “they can” (remember George Mallory, famously quoted as replying to the question “Why do you want to climb Mount Everest?” with the retort “Because it’s there”).

Did Big Data destroy the U.S. healthcare system?


The problem-plagued rollout of healthcare.gov has dominated the news in the USA. Proponents of the Affordable Care Act (ACA) urge that teething problems are inevitable and that’s all these are. In fact, President Obama has been at pains to say the ACA is more than just a website. Opponents of the law see the website failures as one more indicator that it is unworkable.

The premise of the ACA is that young healthy persons will sign up in large numbers and help defray the costs expected from older persons, thus providing a good deal for all. It has also been argued that the ACA is a good deal for young healthies. The debate between proponents and opponents of the ACA hinges on this point. See, for example, the debate (shouting match?) between Dr. Zeke Emmanuel and James Capretta on Fox News Sunday. In this segment, Capretta says the free market will solve the problem (but it hasn’t so far, has it?), and so Emmanuel says it must be mandated.

So why, then, has the free market not solved the problem? Robert X. Cringely argues that big data is the culprit. Here’s his argument:

– In the years before Big Data was available, actuaries at insurance companies studied morbidity and mortality statistics in order to set insurance rates. This involved metadata — data about data — because for the most part the actuaries weren’t able to drill down far enough to reach past broad groups of policyholders to individuals. In that system, insurance company profitability increased linearly with scale, so health insurance companies wanted as many policyholders as possible, making a profit on most of them.

– Enter Big Data. The cost of computing came down to the point where it was cost-effective to calculate likely health outcomes on an individual basis.

– Result? The health insurance business model switched from covering as many people as possible to covering as few people as possible — selling insurance only to healthy people who didn’t much need the healthcare system. The goal went from making a profit on most enrollees to making a profit on all enrollees.
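Cringely's argument can be sketched as a toy model (all numbers invented):

```python
# Toy model of the group-pricing vs. individual-pricing argument.
# All costs, margins and the underwriting ceiling are invented.

def group_premium(expected_costs, margin=1.1):
    """Pre-big-data: one price for the whole pool, average cost plus a margin."""
    return margin * sum(expected_costs) / len(expected_costs)

def individual_offers(expected_costs, margin=1.1, ceiling=6000):
    """Big-data era: price each person; simply decline anyone whose premium
    would exceed what the market will bear (the `ceiling`)."""
    return [margin * c for c in expected_costs if margin * c <= ceiling]

pool = [1000, 2000, 3000, 20000]        # one costly member in the pool
print(round(group_premium(pool)))       # → 7150
print(individual_offers(pool))          # the costly member is simply dropped
```

Under group pricing the costly member is subsidized by the pool; once costs can be computed per individual, the profit-maximizing move is to drop that member entirely, which is exactly the shift Cringely describes.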

Information Security Officer Extraordinaire


Last year at this time, the running count already totaled approximately 27.8 million records compromised and 637 breaches reported. This year, that tally so far equals about 10.6 million records compromised and 483 breaches reported. It’s a testament to the progress the industry has made in the fundamentals of compliance and security best practices. But this year’s record is clearly far from perfect.

The VAR's tale


The Canterbury Tales is a collection of stories written by Geoffrey Chaucer at the end of the 14th century. The tales were part of a storytelling contest between pilgrims travelling to Canterbury Cathedral, with the prize being a free meal on their return. While the original is in Middle English, here is the VAR's tale in modern-day English.

In the beginning, the Value Added Reseller (VAR) represented products to the channel, and it was good. Software publishers of note always preferred the indirect sales model and took great pains to cultivate the VAR or channel, and it was good. The VAR maintained the relationship with the end user and understood the nuances of their needs. The VAR gained the trust of the end user by first understanding, then recommending, and finally supporting their needs with quality, unbiased recommendations, and it was good. End users, in turn, trusted their VAR to look out for their needs and to recommend the most suitable products.

Then came the cloud which appeared white and fluffy and unthreatening to the end user. But dark and foreboding to the VAR, the cloud was. It threatened to disrupt the established business model. It allowed the software publisher to sell product directly to the end user and bypass the VAR. And it was bad for the VAR. Google started it with Office Apps. Microsoft countered with Office 365. And it was bad for the VAR. And then McAfee did the same for their suite of security products. Now even the security focused VARs took note. Woe is me, said the VAR. Now software publishers are selling directly to the end user and I am bypassed. Soon the day will come when cats and dogs are friends. What are we to do?

Enter Quentin Reynolds, who famously said, “If you can’t lick ‘em, join ‘em.” Can one roll back the cloud? No more than King Canute could stop the tide rolling in. What does this mean, then? It means a VAR must transition from being a reseller of products to a reseller of services, or better yet, a provider of services. In this way may the VAR regain relevance with the end user and cement the trust built up between them over the years.

Thus the VAR's tale may have a happy ending, wherein the end user has a more secure network, the auditor, being satisfied, returns to his keep, and the VAR is relevant again.

Which service would suit, you ask? Well, consider one that is not a commodity, one that requires expertise, one that is valued by the end user, one that is not set-and-forget. IT Security leaps to mind; it satisfies all these criteria. Even more so within this field are SIEM, Log Management, Vulnerability scanning and Intrusion Detection, given their relevance to both security and regulatory compliance.

Auditing File Shares with the Windows Security Log


Over the years, security admins have repeatedly asked me how to audit file shares in Windows. Until Windows Server 2008, there were no specific events for file shares; the best we could do was to enable auditing of the registry key where shares are defined. But in Windows Server 2008 and later, there are two new subcategories for share-related events.
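As a sketch of what consuming those events might look like: the two subcategories (File Share and Detailed File Share) emit IDs such as 5140 (a network share object was accessed) and 5142/5143/5144 (a share was added/modified/deleted). The record format below is simplified for illustration.

```python
# Sketch: collapse raw share-related Security Log records into a simple
# (user, share, action) activity trail. Event IDs per the File Share
# subcategories on Server 2008+; record fields are simplified stand-ins.

SHARE_EVENTS = {5140: "accessed", 5142: "added", 5143: "modified", 5144: "deleted"}

def share_activity(events):
    """Keep only share-related events, as (user, share, action) tuples."""
    return [
        (ev["SubjectUserName"], ev["ShareName"], SHARE_EVENTS[ev["EventID"]])
        for ev in events
        if ev["EventID"] in SHARE_EVENTS
    ]

events = [
    {"EventID": 5142, "SubjectUserName": "admin", "ShareName": r"\\srv\Payroll"},
    {"EventID": 5140, "SubjectUserName": "jdoe", "ShareName": r"\\srv\Payroll"},
    {"EventID": 4624, "SubjectUserName": "jdoe", "ShareName": ""},  # logon, ignored
]
print(share_activity(events))
```

A trail like this answers the governance question directly: who created or touched which share, and when.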

The air gap myth


As we work with various networks to implement IT Security in general, and SIEM, Log Management and Vulnerability scanning in particular, we sometimes meet teams who inform us that they have air-gapped networks. An air gap is a network security measure that consists of ensuring physical isolation from unsecured networks (like the Internet, for example), the premise being that harmful packets cannot “leap” across the gap. This type of measure is most often seen in utility and defense installations. But is it really effective in improving security?

A study by the Idaho National Laboratory shows that in the utility industry, while an air gap may provide some defense, there are many more points of vulnerability in older networks. Critical industrial equipment is often of an older vintage, from when insecure coding practices were the norm, and over the years such systems have had web front ends grafted onto them to ease configuration and management. This makes them very vulnerable indeed. In addition, these older systems are often missing key controls such as encryption. When automation is added to such systems (to improve reliability or reduce operations cost), the potential for damage is quite high.

In a recent interview, Eugene Kaspersky stated that the ultimate air gap had been compromised: the International Space Station, he said, suffered from virus epidemics. Kaspersky revealed that Russian astronauts had carried a removable device into space which infected systems on the space station. He did not elaborate on the impact of the infection on operations of the International Space Station (ISS), nor give details about when it took place, but it appears to have been prior to May of this year, when the United Space Alliance, the group that oversees the operation of the ISS, moved all systems entirely to Linux to make them more “stable and reliable.”

Prior to this move, the “dozens of laptops” used on board the space station had been running Windows XP. According to Kaspersky, the infections occurred on laptops used by scientists who used Windows as their main platform and carried USB sticks into space when visiting the ISS. A 2008 report on ExtremeTech said that a Russian astronaut brought onto the ISS a Windows XP laptop infected with the W32.Gammima.AG worm, which quickly spread to other laptops on the station – all of which were running Windows XP.

If the Stuxnet infection from June 2010 wasn’t enough evidence, this should lay the air gap myth to rest.

End(er’s) game: Compliance or Security?


Whom do you fear more: the Auditor or the Attacker? The former plays by well-established rules, gives plenty of notice before arriving on your doorstep, and is usually prepared to accept a Plan of Action with Milestones (POAM) in case of deficiencies. The latter gives no notice, never plays fair, and will gleefully exploit any deficiency. Notwithstanding this, most small enterprises actually fear the auditor more and will jump through hoops to minimize their interaction. It’s ironic, because the auditor is really there to help; the attacker obviously is not.

While it is true that 100% compliance is not achievable (or, for that matter, desirable), it is also true that even the most basic steps towards compliance go a long way towards deterring attackers. The comparison to the merits of physical exercise is an easy one: how often have you heard it said that even mild physical exercise (taking the stairs instead of the elevator) gives you benefit? You don’t have to be a gym rat pumping iron for hours every day.

And so, to answer the question: What comes first, Compliance or Security? It’s Security really, because Compliance is a set of guidelines to help you get there with the help of an Auditor. Not convinced? The news is rife with accounts of exploits which in many cases are at organizations that have been certified compliant. Obviously there is no such thing as being completely secure, but will you allow the perfect to be the enemy of the good?

The National Institute of Standards and Technology (NIST) released Rev 4 of its seminal publication 800-53, which applies to US Government IT systems. As budgets (time, money, people) are always limited, it all begins with risk classification: applying scarce resources in order of value. There are other guidelines, such as the SANS Institute Consensus Audit Guidelines, to help you make the most of limited resources.

You may not have trained like Ender Wiggin from a very young age through increasingly difficult games, but it doesn’t take a tactical genius to recognize the “Buggers” as attackers and the Auditors as frenemies.

Looking for assistance with your IT Security needs? Click here for our newest publication and learn how you can simplify with services.

Simplifying SIEM


Since its inception, SIEM has been something for the well-to-do IT Department: the one that can spend tens or hundreds of thousands of dollars on a capital acquisition of the technology and then afford the luxury of qualified staff to use it in the intended manner. In some cases, they hire experts from the SIEM vendor to “man the barricades.” For a typical IT Department in a medium enterprise or small business, this is a ride in Fantasy Land: budgets simply do not allow capital expenditures of multiple six or even five figures; expert staff, to the extent they exist, are hardly idling and available to work the SIEM console; and as for hiring outside experts – the less said, the better. And so SIEM has remained the province of the well-heeled.

Three common SMB mistakes


Small and medium business (SMB) owners/managers understand that IT plays a vital role within their companies. However, many SMBs are still making simple mistakes with the management of their IT systems, which are costing them money.

1) Open source solutions: In a bid to reduce overall costs, many SMBs look to open source applications and platforms. While such solutions appear attractive because of low or no license costs, the effort required for installation, configuration, operation, maintenance and ongoing upgrades should be factored in. The total cost of ownership of such systems is generally ignored or poorly understood. In many cases, they may require a more sophisticated (and therefore more expensive and harder to replace) user to drive them.

2) Migrating to the cloud: Cloud-based services promise great savings, which is always music to an SMB manager/owner’s ears, and the entire SaaS market has exploded in recent years. However, the cost savings are not always obvious or tangible. The Amazon EC2 service is often touted as an example of cost savings, but it very much depends on how you use the resource. See this blog for an example. More appropriate might be a hybrid system that keeps some data and services in-house while others move to the cloud.

3) The knowledge gap: Simply buying technology, be it servers or software, does not provide any tangible benefit. You have to integrate it into the day-to-day business operation, and this takes expertise both with the technology and with your particular business.

In the SIEM space, these buying objections have often stymied SMBs from adopting the technology, despite its benefits and repeated advice from experts. To overcome these, we offer a managed SIEM offering called SIEM Simplified.

The Holy Grail of SIEM


Merriam-Webster defines “holy grail” as “a goal that is sought after for its great significance.” Mike Rothman of Securosis has described the security practitioner’s holy grail as twofold:

  1. A single alert specifying exactly what is broken, with relevant details and the ability to learn the extent of the damage
  2. Making the auditor go away, as quickly and painlessly as possible

How do you achieve the first goal? Here are the steps:

  • Collect log information from every asset on the enterprise network,
  • Filter it through vendor provided intelligence on its significance
  • Filter it through local configuration to determine its significance
  • Gather and package related, relevant information – the so-called 5 Ws (Who, What, Where, When and Why)
  • Alert the appropriate person in the notification method they prefer (email, dashboard, ticket etc.)
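The steps above can be sketched as a toy pipeline (the severity rules and notification hook below are illustrative, not EventTracker's actual logic):

```python
# Toy sketch of the collect -> filter -> package -> alert pipeline.
# Rule tables and field names are invented for illustration only.

def enrich(event, vendor_rules, local_rules):
    """Filter through vendor-provided significance, then local overrides."""
    severity = vendor_rules.get(event["type"], "ignore")
    severity = local_rules.get(event["type"], severity)  # local config wins
    return {**event, "severity": severity}

def alert(event, notify):
    """Package the 5 Ws and hand off to the preferred notification method."""
    if event["severity"] == "ignore":
        return
    notify({w: event.get(w) for w in ("who", "what", "where", "when", "why")})

vendor_rules = {"failed_login": "low", "admin_group_change": "high"}
local_rules = {"failed_login": "high"}   # this site cares about failed logins

sent = []                                # stand-in for email/dashboard/ticket
raw = {"type": "failed_login", "who": "jdoe", "what": "failed_login",
       "where": "srv01", "when": "2013-11-02T03:14", "why": "bad password"}
alert(enrich(raw, vendor_rules, local_rules), sent.append)
print(len(sent))  # → 1
```

The real work, of course, lies in the quality of the rule tables, which is exactly where the accumulated vendor intelligence and local tuning come in.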

This is a fundamental goal for SIEM systems like EventTracker, and over the ten-plus years of working on this problem, we have built a huge collection of intelligence to draw on to help configure and tune the system to your needs. Even so, there is an undefinable element of luck in having it all work out for you just when you need it, and Murphy’s Law says that luck is not on your side. So now what?

One answer we have found is anomalous behavior detection: learn “normal” behavior during a baseline period, then draw the attention of a knowledgeable user to out-of-the-ordinary or new items. When you join these two systems, you get coverage for the known-knowns as well as the unknown-unknowns.
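A minimal sketch of the baseline idea, with invented process names:

```python
# Toy baseline detector: everything seen during the learning window is
# "normal"; afterwards, anything unseen is surfaced for human review.

class BaselineDetector:
    def __init__(self):
        self.known = set()
        self.learning = True

    def observe(self, item):
        """Learn during the baseline period; flag new items afterwards."""
        if self.learning:
            self.known.add(item)
            return None
        if item not in self.known:
            return f"new/out-of-ordinary: {item}"
        return None

det = BaselineDetector()
for proc in ["svchost.exe", "outlook.exe", "svchost.exe"]:
    det.observe(proc)                    # baseline week: all of this is "normal"
det.learning = False
print(det.observe("mimikatz.exe"))       # → new/out-of-ordinary: mimikatz.exe
print(det.observe("outlook.exe"))        # → None
```

A real system baselines many dimensions (users, processes, IP addresses, admin activity) and ages the baseline over time, but the known-set-versus-new-item core is the same.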

The second goal involves more discipline and less black magic. If you are familiar with the audit process, then you know that it is all about preparation and presentation. The Duke of Wellington famously remarked that the Battle of Waterloo “was won on the playing fields of Eton,” another testament to winning through preparation. Here again, to enable diligence, EventTracker Enterprise offers several features, including report/alert annotation, summary reports on reports, incident acknowledgement, and an electronic logbook to record mitigation and incident-handling actions.

Of course, all this requires staff with the time and training to use these features. Lack time and resources, you say? We’ve got you covered with SIEM Simplified, a co-sourcing option in which we do the heavy lifting, leaving you to sip from the Cup of Jamshid.

Have neither the time, nor the tools, nor budget? Then the story might unfold like this.

SIEM vs Search Engine


The pervasiveness of Google in the tech world has placed the search function in a central locus of our daily routine. Indeed many of the most popular apps we use every day are specialized forms of search. For example:

  • E-mail is a search over incoming messages: search by sender, by topic, by key phrase, by thread
  • Voice calling or texting is preceded by a search for a contact
  • Yelp is really searching for a restaurant
  • The browser address bar is in reality a search box

And the list goes on.

In the SIEM space, the rise of Splunk, especially when coupled with the promise of “big data”, has led to speculation that SIEM is going to be eclipsed by the search function. Let’s examine this a little more closely, especially from the viewpoint of an expert-constrained small and medium enterprise (SME), where data scientists are not idling aplenty.

Big data and accompanying technologies are, at present, developer-level elements that require assembly with application code, or intricate setup and configuration, before they can be used by typical system administrators, much less mid-level managers. To leverage the big-data value proposition of such platforms, the core skill required of such developers is thinking about distributed computing, where the processing is performed in batches across multiple nodes. This is not a common skill set in the SME.

Assuming the assembly problem is somehow overcome, can you rejoice in your big-data-set and reduce the problems that SIEM solves to search queries? Well maybe, if you are a Data Scientist and know how to use advanced analytics. However, SIEM functions include things like detecting cyber-attacks, insider threats and operational conditions such as app errors – all pesky real-time requirements. Not quite so effective as a search on archived and indexed data of yesterday. So now the Data Scientist must also have infosec skills and understand the IT infrastructure.
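The real-time point deserves a concrete illustration. A batch search over yesterday’s index can answer “did this happen?”, but a SIEM rule must fire as events stream in. Here is a minimal sketch of one such streaming rule, a failed-logon burst detector; the event shape, threshold and window are invented for the example and real SIEM correlation rules are far richer.

```python
# Sketch of a real-time correlation rule: alert when one source produces
# 5 failed logons within a 60-second window. The rule is evaluated per
# event as it arrives (streaming), not as a query over archived data.
from collections import defaultdict, deque

WINDOW_SECONDS = 60
THRESHOLD = 5

recent = defaultdict(deque)  # source -> timestamps of recent failures

def on_event(event, alerts):
    """Process one event as it arrives, raising an alert on a burst."""
    if event["type"] != "logon_failure":
        return
    q = recent[event["source"]]
    q.append(event["ts"])
    # Drop timestamps that have aged out of the sliding window.
    while q and event["ts"] - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= THRESHOLD:
        alerts.append((event["source"], event["ts"]))
        q.clear()  # reset so one burst raises exactly one alert

alerts = []
for ts in range(10, 60, 10):  # five failures in 50 seconds from one host
    on_event({"type": "logon_failure", "source": "10.0.0.5", "ts": ts}, alerts)
```

The point is not the few lines of code but the operational mode: the detection happens at event arrival time, which is exactly what an after-the-fact search over an archive cannot give you.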

You can probably appreciate that decent infosec skills such as network security, host security, data protection, security event interpretation, and attack vectors do not abound in the SME. There is no reason to think that the shortage of cyber-security professionals and the ultra-shortage of data scientists and experienced Big Data programmers will disappear anytime soon.

So how can an SME leverage the promise of big data now? Well, frankly, EventTracker has been grappling with the challenges of massive, diverse, fast data for many years before it became popularly known as Big Data. In testing on COTS hardware, our recent 7.4 release showed up to a 450% increase in receiver/archiver performance over the previous 7.3 release on the same hardware. This is not an accident; we have been thinking about and working on this problem continuously for the last 10 years. It’s what we do. This version also has advanced data-science methods built into EventVault Explorer, our data-mart engine, so that security analysts don’t need to be data scientists. Our behavior module incorporates data visualization capabilities to help users recognize hidden patterns and relations in the security data, the so-called “Jeopardy” problem, wherein the answers are present in the data set; the challenge is in asking the right questions.

Last but not least, we recognize that notwithstanding all the chest-thumping above, many (most?) SMEs are so resource-constrained that a disciplined SOC-style approach to log review and incident handling is out of reach. Thus we offer SIEM Simplified, a service where we do the heavy lifting, leaving the remediation to you.

Search engines are no doubt a remarkably useful innovation that has transformed our approach to many problems. However, SIEM satisfies specific needs in today’s threat, compliance and operations environment that cannot be satisfied effectively or efficiently with a raw big-data platform.

Resistance is futile


The Borg are a fictional alien race and a terrifying antagonist in the Star Trek franchise. The phrase “Resistance is futile” is best delivered by Patrick Stewart in the episode The Best of Both Worlds.

When IBM demonstrated the power of Watson in 2011 by defeating two of the best humans to ever play Jeopardy!, Ken Jennings, who had won 74 games in a row, admitted in defeat, “I, for one, welcome our new computer overlords.”

As the Edward Snowden revelations about the collection of metadata for phone calls became known, the first thought was that it would be technically impossible to store data for every single phone call; the cost would be prohibitive. Then Brewster Kahle, one of the engineers behind the Internet Archive, made this spreadsheet to calculate the storage cost to record and store one year’s worth of all U.S. calls. He works the cost out to about $30M, which is non-trivial but by no means out of reach for a large US government agency.

The next thought was: OK, so maybe it’s technically feasible to record every phone call, but how could anyone possibly listen to every call? Well, obviously this is not possible, but can search terms be applied to locate “interesting” calls? Again, we didn’t think so, until another N.S.A. document, cited by The Guardian, showed a “global heat map” that appeared to represent how much data the N.S.A. sweeps up around the world. If it is possible to efficiently mine metadata, data about who is calling or e-mailing, then the pressure for wiretapping and eavesdropping on communications becomes secondary.

This study in Nature shows that just four data points about the location and time of a mobile phone call make it possible to identify the caller 95 percent of the time.

IBM estimates that thanks to smartphones, tablets, social media sites, e-mail and other forms of digital communications, the world creates 2.5 quintillion bytes of new data daily. Searching through this archive of information is humanly impossible, but it is precisely what a Watson-like artificial intelligence is designed to do. Isn’t that exactly what was demonstrated in 2011 to win Jeopardy?

Savvy IT Is The Way To Go


There is a lot of discussion, in the context of cloud as well as traditional computing, regarding Smart IT, Smarter Planets, and Smart and Smarter Computing. This makes a lot of sense in light of the explosion in the amount of collected data and the massive efforts aimed at using analytics to yield insight, information and intelligence about, well, just about everything. We have no problem with smart activities.

The Dark Side of Big Data


A study published in Nature looked at the phone records of some 1.5 million mobile phone users in an undisclosed small European country and found that it took only four different data points on the time and location of a call to identify 95% of the people. In the dataset, the location of an individual was specified hourly, with a spatial resolution given by the carrier’s antennas.
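The intuition behind the study’s result can be reproduced on synthetic data: check what fraction of users are the only ones whose trace contains a given handful of (hour, cell) points. This is a toy sketch, not the Nature methodology; the real study used 1.5 million subscribers and antenna-level resolution, and all sizes below are invented.

```python
# Toy re-identification experiment: how often do 4 random spatiotemporal
# points from a user's trace match that user alone? Synthetic data only.
import random

random.seed(7)
N_USERS, POINTS_PER_USER, CELLS, HOURS, K = 200, 50, 40, 24, 4

# Each user's trace is a set of (hour, cell-tower) observations.
traces = {
    u: {(random.randrange(HOURS), random.randrange(CELLS))
        for _ in range(POINTS_PER_USER)}
    for u in range(N_USERS)
}

def unique_with_k_points(user, k):
    """Do k random points from this user's trace match only this user?"""
    sample = set(random.sample(sorted(traces[user]), k))
    matches = [u for u, t in traces.items() if sample <= t]
    return matches == [user]

frac = sum(unique_with_k_points(u, K) for u in traces) / N_USERS
```

With even modest trace lengths, the fraction of uniquely identifiable users is typically high: human mobility patterns are far more distinctive than they feel.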

Mobility data is among the most sensitive data currently being collected. It contains the approximate whereabouts of individuals and can be used to reconstruct individuals’ movements across space and time. A simply anonymized dataset does not contain name, home address, phone number or other obvious identifier. For example, the Netflix Challenge provided a training dataset of 100,480,507 movie ratings each of the form <user, movie, date-of-grade, grade> where the user was an integer ID.

Yet, if an individual’s patterns are unique enough, outside information can be used to link the data back to that individual. For instance, in one study, a medical database was successfully combined with a voter list to extract the health record of the governor of Massachusetts. In the case of the Netflix data set, despite the attempt to protect customer privacy, it was shown possible to identify individual users by matching the data set with film ratings on the Internet Movie Database. Even coarse data sets provide little anonymity.

The issue is making sure the debate over big data and privacy keeps up with the science. Yves-Alexandre de Montjoye, one of the authors of the Nature article, says that the ability to cross-link data, such as matching the identity of someone reading a news article to posts that person makes on Twitter, fundamentally changes the idea of privacy and anonymity.

Where do you, and by extension your political representative, stand on this 21st Century issue?

The Intelligence Industrial Complex


If you are old enough to remember the 1988 US Presidential election, then the name Gary Hart may sound familiar. He was the clear frontrunner after his second Senate term from Colorado ended. He was caught in an extra-marital affair and dropped out of the race. He has since earned a doctorate in politics from Oxford and accepted an endowed professorship at the University of Colorado at Denver.

In this analysis, he quotes President Dwight Eisenhower, “…we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists, and will persist.”

His point is that the US now has an intelligence-industrial complex composed of close to a dozen and a half federal intelligence agencies and services, many of which are duplicative, plus, in the last decade or two, a growing private-sector intelligence world. It is dangerous to have a technology-empowered government capable of amassing private data; it is even more dangerous to privatize this Big Brother world.

As has been extensively reported recently, the Foreign Intelligence Surveillance Act (FISA) courts are required to issue warrants, as the Fourth Amendment (against unreasonable search and seizure) requires, upon a showing that national security is endangered. This was instituted in the early 1970s following findings of serious unconstitutional abuse of power. He asks, “Is the Surveillance State — the intelligence-industrial complex — out of the control of the elected officials responsible for holding it accountable to American citizens protected by the U.S. Constitution?”

We should not have to rely on whistle-blowers to protect our rights.

In a recent interview with Charlie Rose of PBS, President Obama said, “My concern has always been not that we shouldn’t do intelligence gathering to prevent terrorism, but rather: Are we setting up a system of checks and balances?” Despite this, he avoided addressing the fact that no request to a FISA court has ever been rejected, and that companies that provide data on their customers are under a gag order that prevents them from even disclosing the requests.

Is the Intelligence-Industrial complex calling the shots? Does the President know a lot more than he can reveal? Clearly he is unwilling to even consider changing his predecessor’s policies.

It would seem that Senator Hart has a valid point. If so, it’s a lot more consequential than Monkey Business.

Introducing EventTracker Log Manager


The IT team of a small business has it the worst: just one or two administrators to keep the entire operation running, which includes servers, workstations, patching, anti-virus, firewalls, applications, upgrades, password resets… the list goes on. It would be great to have 25 hours in a day and four hands per admin just to keep up. Adding security or compliance demands to the list just makes it that much harder.

The path to relief? Automation, in one word. Something that you can “fit-and-forget”.

You need a solution that gathers all security information from around the network (platforms, network devices, apps and so on) and that knows what to do with it. One that retains it all efficiently and securely for later analysis if needed, displays it in a dashboard for you to examine at your convenience, alerts you via e-mail/SMS if absolutely necessary, indexes it all for fast search, and finds new or out-of-the-ordinary patterns by itself.

And you need it all in a software-only package that is quickly installed on a workstation or server. That’s what I’m talking about. That’s EventTracker Log Manager.

Designed for the 1-2 sys admin team.
Designed to be easy to use, quick to install and deploy.
Based on the same award-winning technology that SC Magazine awarded a perfect 5-star rating to in 2013.

How do you spell relief? E-v-e-n-t-T-r-a-c-k-e-r  L-o-g  M-a-n-a-g-e-r.
Try it today.

Following a User’s Logon Tracks throughout the Windows Domain


What security events get logged when a user logs on to their workstation with a domain account and proceeds to run local applications and access resources on servers in the domain? When a user logs on at a workstation with their domain account, the workstation contacts domain controller via Kerberos and requests a ticket granting ticket (TGT).
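Once those events are exported, the trail can be stitched together programmatically. The sketch below uses the standard Windows security event IDs for this flow: 4768 (Kerberos TGT requested, logged on the domain controller), 4769 (Kerberos service ticket requested) and 4624 (successful logon, logged on the target machine). The export format here (a list of dicts) is hypothetical; a real export might come from CSV, EVTX or a SIEM query.

```python
# Sketch: reconstructing one user's logon trail from exported Windows
# security events. Event IDs 4768/4769/4624 are the standard Kerberos
# TGT / service-ticket / logon events; the record layout is invented.

INTERESTING = {4768: "TGT requested", 4769: "service ticket", 4624: "logon"}

def logon_trail(events, account):
    """Return the time-ordered Kerberos/logon events for one account."""
    trail = [e for e in events
             if e["event_id"] in INTERESTING and e["account"] == account]
    return sorted(trail, key=lambda e: e["time"])

events = [
    {"time": 2, "event_id": 4624, "account": "rsmith", "host": "WKS01"},
    {"time": 1, "event_id": 4768, "account": "rsmith", "host": "DC01"},
    {"time": 3, "event_id": 4769, "account": "rsmith", "host": "DC01"},
    {"time": 1, "event_id": 4624, "account": "jdoe",   "host": "WKS02"},
]

trail = logon_trail(events, "rsmith")
```

Reading the sorted trail host by host shows the expected Kerberos sequence: TGT issued at the domain controller, logon at the workstation, then service tickets as the user touches servers in the domain.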

Secure your electronic trash


At the typical office, computer equipment becomes obsolete or slow and periodically requires replacement or refresh. This includes workstations, servers, copy machines, printers and so on. Users who get the upgrades are inevitably pleased; they carefully move their data to the new equipment and happily release the older gear. What happens after this? Does someone cart it off to the local recycling depot? Do you call for a dumpster? This is likely the case in the small and medium enterprise, whereas large enterprises may hire an electronics recycler.

This blog post by Kyle Marks in the Harvard Business Review reminds us that sensitive data can very well be leaked via decommissioned electronics as well.

A SIEM solution like EventTracker is effective when leakage occurs from connected equipment or even mobile laptops or those that connect infrequently. However, disconnected and decommissioned equipment is invisible to a SIEM solution.

If you are subject to regulatory compliance, leakage is leakage. Data security laws mandate that organizations implement “adequate safeguards” to ensure privacy protection of individuals. That applies equally when the leakage comes from your electronic trash. You are still bound to safeguard the data.

Marks points out that detailed tracking data, however, reveals a troubling fact: four out of five corporate IT asset disposal projects had at least one missing asset. More disturbing is the fact that 15% of these “untracked” assets are devices potentially bearing data such as laptops, computers, and servers.

Treating IT asset disposal as a “reverse procurement” process will deter insider theft. EventTracker cannot help with this, but it is equally important in addressing compliance and security regulations.

You often see a gumshoe or Private Investigator in the movies conduct Trash Archaeology in looking for clues. Now you know why.

What did Ben Franklin really mean?


In the aftermath of the disclosure of the NSA program called PRISM by Edward Snowden to a reporter at The Guardian, commentators have gone into overdrive and the most iconic quote is one attributed to Benjamin Franklin “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety”.

It was amazing that something said over 250 years ago would be so apropos. Conservatives favor an originalist interpretation of documents such as the US Constitution (see Federalist Society) and so it seemed possible that very similar concerns existed at that time.

Trying to get to the bottom of this quote, Ben Wittes of Brookings wrote that it does not mean what it seems to say.

The words appear originally in a 1755 letter that Franklin is presumed to have written on behalf of the Pennsylvania Assembly to the colonial governor during the French and Indian War. The Assembly wished to tax the lands of the Penn family, which ruled Pennsylvania from afar, to raise money for defense against French and Indian attacks. The Penn family was unwilling to acknowledge the power of the Assembly to tax them. The Governor, being an appointee of the Penn family, kept vetoing the Assembly’s effort. The Penn family later offered cash to fund defense of the frontier, as long as the Assembly would acknowledge that it lacked the power to tax the family’s lands.

Franklin was thus complaining of the choice facing the legislature between being able to make funds available for frontier defense versus maintaining its right of self-governance. He was criticizing the Governor for suggesting it should be willing to give up the latter to ensure the former.

The statement is typical of Franklin’s style and rhetoric, which also includes “Sell not virtue to purchase wealth, nor Liberty to purchase power.” While the circumstances were quite different, it seems the general principle he was stating is indeed relevant to the Snowden case.

What is happening to log files? The Internet of Things, Big Data, Analytics, Security, Visualization – OH MY!


Over the past year, enterprise IT has had more than a few things emerge to frustrate and challenge it. High on the list has to be limited budget growth in the face of increasing demand for and expectations of new services. In addition, there has been an explosion in the list of technologies and concerns that appear to be particularly intended to complicate the task of maintaining smooth running operations and service delivery.

What, me worry?


Alfred E. Neuman is the fictitious mascot and cover boy of Mad Magazine. Al Feldstein, who took over as editor in 1956, said, “I want him to have this devil-may-care attitude, someone who can maintain a sense of humor while the world is collapsing around him.”

The #1 reason management doesn’t get security is the sense that “It can’t happen to me” or “What, me worry?” The general argument goes: we are not involved in financial services or national defense, so why would anyone care about what we have? And in any case, even if they hack us, what would they get? It’s not even worth the bother. Larry Ponemon, writing in the Harvard Business Review, captures this sentiment.

Attackers are increasingly targeting small companies, planting malware that not only steals customer data and contact lists but also makes its way into the computer systems of other companies, such as vendors. Hackers might also be more interested in your employees than you’d think. Are your workers relatively affluent? If so, chances are the hackers are way ahead of you and are either looking for a way into your company, or are already inside, stealing employee data and passwords which (as they well know) people tend to reuse for all their online accounts.

Ponemon says “It’s literally true that no company is immune anymore. In a study we conducted in 2006, approximately 5% of all endpoints, such as desktops and laptops, were infected by previously undetected malware at any given time. In 2009—2010, the proportion was up to 35%. In a new study, it looks as though the figure is going to be close to 54%, and the array of infected devices is wider too, ranging from laptops to phones.”

In the recent revelations by Edward Snowden who blew the whistle on the NSA program called “Prism”, many prominent voices have said they are ok with the program and have nothing to hide. This is another aspect of “What, me worry?” Benjamin Franklin had it right many years ago, “Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety.”

Learning from LeBron


Thinking about implementing analytics? Before you do that, ask yourself “What answers do I want from the data?”

After the Miami Heat lost the 2011 NBA playoffs to the Dallas Mavericks, many armchair MVPs were only too happy to explain that LeBron was not a clutch player and didn’t have what it takes to win championships in this league. Both LeBron and Coach Erik Spoelstra, however, were determined to convert that loss into a teaching moment.

Analytics was indicated. But what was the question?  According to Spoelstra, “It took the ultimate failure in the Finals to view LeBron and our offense with a different lens. He was the most versatile player in the league. We had to figure out a way to use him in the most versatile of ways — in unconventional ways.” In the last game of the 2011 Finals, James was almost listlessly loitering beyond the arc, hesitating, shying away, and failing to take advantage of his stature. His last shot of those Finals was symbolic: an ill-fated 25-foot jump shot from the outskirts of the right wing — his favorite 3-point shot location that season.

LeBron decided the correct answer was to work on the post-up game during the off season. He spent a week learning from the great Hakeem Olajuwon. He brought his own videographer to record the sessions for later review. LeBron arrived early for each session and was stretched and ready to go every time. He took the lessons to the gym for the rest of the off season. It worked. James emerged from that summer transformed. “When he returned after the lockout, he was a totally different player,” Spoelstra says. “It was as if he downloaded a program with all of Olajuwon’s and Ewing’s post-up moves. I don’t know if I’ve seen a player improve that much in a specific area in one offseason. His improvement in that area alone transformed our offense to a championship level in 2012.”

The true test of analytics isn’t just how good the data is, but how committed you are to acting on it. At the 2012 NBA Finals, LeBron won the MVP title and Miami, the championship.

The lesson here is to know what answers you are seeking from the data, and to commit to going where the data takes you.

Using Dynamic Audit Policy to Detect Unauthorized File Access


One thing I always wished you could do in Windows auditing was mandate that access to an object be audited if the user was NOT a member of a specified group. Why? Well sometimes you have data that you know a given group of people will be accessing and for that activity you have no need of an audit trail. Let’s just say you know that members of the Engineering group will be accessing your Transmogrifier project folder and you do NOT need an audit trail for when they do. But this is very sensitive data and you DO need to know if anyone else looks at Transmogrifier.
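The logic being wished for is simple to state: audit access if and only if the accessor is NOT a member of the trusted group. A minimal sketch of that condition, with group and user names invented for the example (a real deployment would express this as a dynamic audit policy condition in Windows, not application code):

```python
# Sketch of the "audit only if NOT a member of the group" condition.
# The Engineering group is trusted to access the Transmogrifier folder,
# so only non-members generate an audit trail. All names are invented.

ENGINEERING = {"alice", "bob"}

def should_audit(user, trusted_group=ENGINEERING):
    """Audit access to the sensitive folder only for non-members."""
    return user not in trusted_group

accesses = ["alice", "mallory", "bob"]
audited = [u for u in accesses if should_audit(u)]
```

The payoff is a dramatically smaller audit trail: routine access by the trusted group generates no noise, while any outsider touching the folder is guaranteed to leave a record.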