New Year's Security Resolutions

Seven steps that will make this year the most secure year yet.

It’s the New Year, which means it’s time for the annual human ritual of making personal promises to give up bad habits and commit to living life better going forward. While most people are focused on renewing their gym memberships or cutting out carbs, my New Year’s resolution is to help make the Internet a safer place. As an industry, our collective security bad habits caught up with us last year, and it’s time for a change. Last year was a very bad year for security. Here is but a small sampling of the headline-grabbing breaches that happened in 2015:

Read More

Three Reasons Why Docker is a Game Changer

Containers represent a fundamental evolution in software development, but not for the reasons most people think.

Docker’s rapid rise to prominence has put it on the radar of almost every technologist today, IT professionals and developers alike. Docker containers hold the promise of providing the ideal packaging and deployment mechanism for microservices, a concept that has also surged in popularity.

But while the industry loves its sparkling new technologies, it is also deeply skeptical of them. Until a new technology has been battle-tested, it’s just an unproven idea with a hipster logo, so it’s not surprising that Docker is being evaluated with a critical eye—it should be.

To properly assess Docker’s utility, however, it’s necessary to follow container-based architecture to its logical conclusion. The benefits of isolation and portability, which get most of the attention, are reasons enough to adopt Docker containers. But the real game changer, I believe, is the deployment of containers in clusters. Container clusters, managed by a framework like Google’s Kubernetes, allow for the true separation of application code and infrastructure, and enable highly resilient and elastic architectures.

Read More

Strengthen Your AWS Security by Protecting App Credentials and Automating EC2 and IAM Key Rotation

Effective information security requires following strong security practices during development. Here are three ways to secure your build pipeline, and the source code to get you started.

One of the biggest headaches faced by developers and DevOps professionals is the problem of keeping the credentials used in application code secure. It’s just a fact of life. We have code that needs to access network resources like servers and databases, and we have to store these credentials somewhere. Even in the best of circumstances this is a difficult problem to solve, but the messy realities of daily life further compound the issue. Looming deadlines, sprawling technology and employee turnover all conspire against us when we try to keep the build pipeline secure. The result is “credential detritus”: passwords and security keys littered across developer workstations, source control repos, build servers and staging environments.

Use EC2 Instance Profiles

A good strategy for minimizing credential detritus is to reduce the number of credentials that need to be managed in the first place. One effective way to do this in AWS is by using EC2 Instance Profiles. An Instance Profile is an IAM Role that is assigned to an EC2 instance when it’s created. Once this is in place, any CLI or SDK calls to AWS resources made by code running on the EC2 instance execute within the security context of the Instance Profile. This is extremely handy because it means that you don’t need to worry about getting credentials onto the instance when it’s created, and you don’t need to manage them on an ongoing basis—AWS automatically rotates the keys for you. Instead, you can spend your time fine-tuning the security policy for the IAM Role to ensure that it has the fewest privileges needed to get its job done.
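The same automation mindset applies to credentials that can’t be eliminated, such as the access keys of IAM users driving a build pipeline. Rotation for those can be scripted with boto3 (the AWS SDK for Python). The sketch below is illustrative, not a production tool: the 90-day threshold is an assumed policy, and the age check is kept separate from the AWS calls so it can be exercised without live credentials.

```python
import datetime

MAX_KEY_AGE = datetime.timedelta(days=90)  # assumed rotation policy

def keys_to_rotate(key_metadata, now):
    # IAM's list_access_keys returns entries with "AccessKeyId" and
    # "CreateDate"; flag any key older than the rotation threshold.
    return [k["AccessKeyId"] for k in key_metadata
            if now - k["CreateDate"] > MAX_KEY_AGE]

def rotate_stale_keys(user_name):
    import boto3  # imported here so the age check above is testable offline
    iam = boto3.client("iam")
    metadata = iam.list_access_keys(UserName=user_name)["AccessKeyMetadata"]
    now = datetime.datetime.now(datetime.timezone.utc)
    for key_id in keys_to_rotate(metadata, now):
        # Create the replacement first so the application can switch over
        # before the old credentials stop working.
        new_key = iam.create_access_key(UserName=user_name)["AccessKey"]
        print("New key for %s: %s" % (user_name, new_key["AccessKeyId"]))
        iam.delete_access_key(UserName=user_name, AccessKeyId=key_id)
```

Note that IAM allows at most two access keys per user, so a real script would also need to handle the case where both key slots are already in use.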

Read More

Eight Reasons Why Agile Motivates Project Teams

Research confirms what software developers already know: Agile projects are more fun and inspiring to work on. In this article, we review the science that explains why Agile fosters greater motivation.

A few weeks ago, I finished conducting a series of video retrospectives with several POP team members who recently completed Agile/Scrum projects. The goal of these one-on-one interviews was to elicit the kinds of critical insights that can only be discovered through in-the-trenches experience. Video recording the conversations allowed me to quickly distribute these Agile learnings to the larger agency in easy-to-digest bites.

It was great listening to the team talk about their Scrum experiences, but what struck me the most was the universal belief among the people I talked to that Agile projects were more fun and motivating than Waterfall projects. I wouldn’t have considered this a pattern if the people I interviewed had all worked on the same project. But the team members I spoke with worked on a variety of different projects, ranging from e-commerce sites, to mobile apps, to frontend-focused work. Furthermore, the participants came from different departments, including design, development, project management and QA. Yet despite these differences, it was clear that everyone I talked to shared one thing in common: they all had a much higher level of satisfaction and motivation when working on Agile projects. So for me the big question was: Why? What was it about Agile that fostered greater motivation and better performance than a typical Waterfall project?

Read More

Using Vagrant, Chef and IntelliJ to Automate the Creation of the Java Development Environment

The long path to DevOps enlightenment begins with the developer’s IDE; here’s how to get started on the journey. In this article, we walk through the steps for automating the creation of a virtual development environment.

One of the challenges faced by software developers working on cloud applications and distributed systems today is setting up the developer workstation in an environment composed of an ever-increasing number of services and technologies. It was already hard enough to configure developer workstations for complex monolithic applications, and it’s even harder now that we’re breaking applications down into multiple microservices and databases. If you are starting to feel like your developers’ workstations have become fragile beasts, able to generate builds only by the grace of God and years of mystery configuration settings, then you are facing trouble. Seek medical help immediately if you are experiencing any of the following symptoms:

  • The onboarding of new developers takes days or even weeks because getting a new development machine configured is a time-consuming and error-prone process.
  • The words “But the code works on my machine” are uttered frequently within your organization.
  • Bugs are often discovered in production that don’t occur in development or staging.
  • The documentation for deploying the application to production is a short text file with a last modified date that’s over a year old.
The good news is that there are technologies and practices to remedy these problems. The long-term cure for this affliction is cultivating a DevOps culture within your organization. DevOps is the hybrid of software development and infrastructure operations. With the rise of virtualization and cloud computing, these two formerly separate departments have found themselves bound together like conjoined twins. In the cloud, hardware is software, and thus software development now includes infrastructure management.
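That cure starts with codifying the developer environment itself. A minimal Vagrantfile sketch (the box name, IP address and cookbook names here are illustrative assumptions, not taken from the article) shows the idea: one file, checked into source control, that boots and provisions an identical VM for every developer.

```ruby
# Vagrantfile: boots a VM and provisions it with Chef Solo so every
# developer gets the same environment with a single "vagrant up".
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"          # base image (assumed)
  config.vm.network "private_network", ip: "192.168.33.10"

  config.vm.provision "chef_solo" do |chef|
    chef.cookbooks_path = "cookbooks"        # cookbooks live with the project
    chef.add_recipe "java"                   # install the JDK
    chef.add_recipe "postgresql"             # an example service dependency
  end
end
```

With this in place, onboarding a new developer shrinks from days of manual setup to a single command, and “works on my machine” bugs become reproducible.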

Read More

The cloud is more than just a new place to park your app: it’s a paradigm shift in how we build software

Cloud computing makes possible a new breed of applications that are far more robust and tolerant of change. Here are 10 key architectural considerations when developing applications born in the cloud.

There was a time, back in the day, when life as a software architect was simple. Release cycles were measured in half-years and quarters. Application traffic was significant, but not insane. Data was always persisted in a centralized, relational database (because that was the only option). And best of all, the applications themselves were hosted on high-end, company-owned hardware managed by a separate operations team. Life back in the day was good.

But then the Internet kept growing.

Release cycles got shorter as the business fought to remain competitive. Traffic continued to grow, and huge spikes could happen at any time. The relational database was coming apart at the seams, no matter how much iron was thrown at it. And in the midst of it all, everyone started talking incessantly about this new thing called “the cloud” that was supposed to fix everything. The brief period of easy living had come to an end.

Read More

Proximity marketing has arrived. Here’s the blueprint for creating a one-to-one digital conversation with your shopper in-store today.

Emerging technologies like iBeacon and Near Field Communication (NFC) have opened up the possibilities for unparalleled in-store interactivity with shoppers. The key is staying focused on using this new tech to actually enhance the shopping experience for the customer.

Emerging in-store positioning technologies like iBeacon hold the promise of highly personalized, “Minority Report”-like marketing programs. However, this technology is still at a very early stage. Retailers who adopt the technology first—and are able to execute it brilliantly—will almost certainly gain a competitive advantage. But the challenge is that it’s not entirely clear what experiences can be created today that actually offer a better shopping experience. Much of what the industry is talking about now centers on using proximity technology to offer coupons to shoppers in-store. I, for one, think we can do a lot better than incessantly pushing discounts to shoppers as they peruse the aisles.

At POP, the innovation team wanted to weed out the hype from the reality by building a real, working prototype using today’s technology to create an in-store shopping experience that didn’t suck. We wanted to build something that added value to the shopping experience for the customer and promoted stronger sales for the retailer.

Read More

Six Practical Steps You Should Take to Protect Yourself from Cyber Criminals

By dissecting the methods used by hackers in the recent wave of cyber attacks, we can identify ways to help us stay more secure online.

A rash of cyber attacks and security news hit over the Labor Day weekend, impacting The Home Depot, Healthcare.gov, Goodwill and Apple. But at least this recent flurry of security activity is positive in one respect: it gives us a glimpse into the mechanics of real-world attack scenarios. The more we can use this as a learning opportunity, the safer we’ll be. Here are a few lessons we should take away from the attacks:

1. Understand that even if you do everything right, you’re still not safe

During the first few days of the September iCloud breach, in which explicit pictures of several celebrities were stolen via Apple’s iCloud backup service, many people were saying that the victims should have used two-factor authentication to protect their information (sadly, another example of the “blame the victim” mentality). It was later disclosed, however, that Apple’s two-factor authentication didn’t actually cover iCloud backups. So even if you were one of the rare, paranoid people who use two-factor authentication, it wouldn’t have protected you.

In a similar vein, having the most secure password in the world wouldn’t have helped the customers of Home Depot or Goodwill, whose stolen credit cards were used in-store. If the people processing your credit cards get hacked, no amount of cyber protection will save you.

Read More

Ukrainian Hacker Strikes Again. Creepy Hacker Community Compromises Apple iCloud.

A wave of high profile security breaches was recently discovered, potentially affecting millions of people. Each attack had a unique footprint, giving us an interesting glimpse into the scary world of cyber crime.

Somewhere in the PR offices of Goodwill, the Department of Health and Human Services, and The Home Depot, a crisis-management specialist is enjoying a small moment of thanks. On the one hand, they’ve probably had a pretty terrible week, dealing with the press and trying to explain the causes and impacts of major security breaches within their organizations. On the other hand, they are probably considering themselves lucky. They know that the best way to divert attention away from their own crises is for another, more interesting crisis to hit at the same time. Fortunately for them, their unspoken prayers were answered. At the same time stories broke about their breaches, it was revealed that naked photographs of high-profile female celebrities were stolen from Apple’s iCloud service. Hacking + Apple + celebrities + naked selfies = four of a kind in the tech news world, and it trumps even news of a security breach that might be bigger than Target’s 2013 attack. Let’s face it, Jennifer Lawrence has a lot more charisma than Home Depot credit card numbers.

Although this string of hacks might have been an unexpected deus ex machina for a few lucky PR professionals, for the rest of us it’s a really scary series of events that forces us to take a step back and ask the question: is anything safe online? Let’s review each of these breaches and see what we can learn from them so we can better protect ourselves in cyberspace.

Read More

Self-Organizing Kilobots Attack!

Harvard University recently developed swarm-intelligent micro-bots that can self-organize and accomplish simple tasks. This is a great illustration of the possibilities of emergent phenomena.

Harvard researchers developed a system of 1,024 micro-robots that move using vibration and can self-organize to accomplish simple tasks, like forming the shape of a wrench or a star. The swarm system is modeled on biological systems (like ants!) that display complex behavior by following a handful of simple rules. The feat was considered a breakthrough due to the large number of bots in the swarm; previous micro-bot swarms numbered fewer than 100.
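The principle behind the result, complex group behavior arising from a few simple local rules, is easy to demonstrate in code. Here is a toy sketch (purely illustrative, not the Kilobot algorithm): agents on a line follow a single rule, moving a small step toward the average position of the neighbors they can sense, and the group clusters together with no central coordinator.

```python
# Toy swarm-aggregation model: each agent senses only neighbors within a
# fixed radius and moves a small step toward their average position.

def step(positions, radius=5.0, speed=0.1):
    new_positions = []
    for i, p in enumerate(positions):
        # Local sensing: an agent sees only neighbors within its radius.
        neighbors = [q for j, q in enumerate(positions)
                     if j != i and abs(q - p) <= radius]
        if neighbors:
            target = sum(neighbors) / len(neighbors)
            # The single rule: move a fraction of the way toward the
            # local average.
            p = p + speed * (target - p)
        new_positions.append(p)
    return new_positions

def spread(positions):
    return max(positions) - min(positions)

swarm = [0.0, 1.5, 3.0, 4.5, 6.0]  # five agents scattered on a line
for _ in range(50):
    swarm = step(swarm)
# The spread of the swarm shrinks steadily from the initial 6.0 units,
# even though no agent ever sees the global picture.
```

The published Kilobot system layers a few more local rules (edge following, gradient formation, localization) on the same principle, in two dimensions, to form arbitrary target shapes.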

Read More

CIA’s Top Security Innovator Proposes Some Ideas That Are Crazy Enough to Work

Dan Geer, the top security chief at the CIA’s VC firm In-Q-Tel, gave a thought-provoking keynote at this year’s Black Hat security conference, arguing that thoughtful government regulation was the best hope for shoring up our cyber defense. He may just be right.

The Iconoclast

Dan Geer has never been one to walk away from a fight. In 2003, he was fired from security firm @Stake after authoring a report, released by the Computer and Communications Industry Association, arguing that Microsoft’s monopoly over the desktop was a national security threat. Given that Microsoft was a client of @Stake at the time, it’s not a shocker that he didn’t make employee of the month. Somewhat humorously, in an interview with Computerworld after the incident, Dan remarked, “It’s not as if there’s a procedure to check everything with marketing.” Somehow I think a guy with degrees from MIT and Harvard didn’t need to check in with marketing to gauge what his firm’s reaction to the paper would be.

Fortunately for the Black Hat audience (and those of us who watched the presentation online), Dan continued to live up to his reputation. He outlined a 10-point policy recommendation (well summarized here) for improving cyber security. In the preamble leading up to the policy recommendations, he made two key points that provide critical support for his policy argument:

  1. The pace of technology change is happening so quickly now that security generalists can no longer keep up. Highly specialized security experts and governments are now needed to protect our information assets.
  2. If you want to increase information security, you have to be pragmatic and willing to make compromises. As Dan succinctly put it: “In nothing else is it more apt to say that our choices are Freedom, Security, Convenience—Choose Two.”

These points are important to keep in mind when listening to his presentation because they provide critical context for his potentially unpalatable policy recommendations.

Read More

Traditional Project Management is 100 Years Old. It’s Time to Upgrade.

Project management as it’s practiced today is a throwback to the industrial revolution, and it hinders innovation in today’s fast-paced, digitally disruptive world. Agile project management is its logical successor, but managers need to embrace it as more than just a software methodology.

This is the third article in a 3-part series:
1. Is Your Company Operating from an Industrial-Era Playbook?
2. Why Performance-Based Compensation Doesn't Work
3. Traditional Project Management Needs an Upgrade (This article)

Don’t worry—we’ve all done it. In fact, most of us are still doing it. Actually, most of us are doing it and still think it’s okay to do it.

No, I’m not talking about sneaking in a little TMZ while we’re at work. I’m talking about using Microsoft Project or Excel to make a project plan—something far worse for productivity than the worker time lost by following the latest celebrity break-ups.

Okay, I admit it: I use Microsoft Project Gantt charts at POP for planning small internal projects. And this isn’t really a problem because the time horizon for these projects is short, the complexity manageable, the impact of delays relatively minor, and the amount of uncertainty fairly limited. In short, it’s a simple tool for a simple problem.

But what happens when the project gets more complicated? When the environment in which the product operates is constantly changing? When deliverables are complex and require significant collaboration across teams and partners? When money is on the line and people’s careers hang in the balance? That’s when the Gantt chart starts to break down.

Read More