The Algorithm Behind the Curtain: Reinforcement Learning Concepts (2 of 5)

Reinforcement Learning (RL) is at the heart of DeepMind’s Go-playing machine. In the second article in this series, we’ll explain what RL is, and why it represents a break from mainstream machine learning.

In the first article in this series, we discussed why AlphaGo’s victory over world champ Lee Sedol in Go represented a major breakthrough for machine learning (ML). In this article, we’ll dissect how reinforcement learning (RL) works. RL is one of the main components of DeepMind’s AlphaGo program.

Reinforcement Learning Overview

Reinforcement learning is a subset of machine learning with roots in computer science techniques established in the mid-1950s. Although it has evolved significantly over the years, until recently reinforcement learning hadn’t received as much attention as other types of ML. To understand why RL is unique, it helps to know a bit more about the ML landscape in general.

Most machine learning methods used in business today are predictive in nature. That is, they attempt to find complex patterns in data — patterns that humans can’t see — in order to predict future outcomes. The term “learning” in this type of machine learning refers to the fact that the more data the algorithm is fed, the better it gets at identifying these invisible patterns, and the better its predictions become.
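To make “predictive” concrete, here is a minimal sketch of my own (hypothetical toy data; scikit-learn is just one common tool, and none of this comes from DeepMind’s work):

```python
# Predictive ML in miniature: fit a model on historical examples with known
# outcomes, then ask it to predict the outcome for a new, unseen example.
from sklearn.linear_model import LogisticRegression

# Hypothetical history: each row is a past customer (age, monthly spend);
# each label is whether they later churned (1) or stayed (0).
X_train = [[34, 120], [51, 30], [29, 450], [45, 80]]
y_train = [0, 1, 0, 1]

model = LogisticRegression()
model.fit(X_train, y_train)  # "learning": more rows generally mean better patterns

print(model.predict([[40, 100]]))  # predicted outcome for a new customer
```

Reinforcement learning, as we’ll see, breaks from this mold: rather than learning from a fixed set of labeled historical examples, it learns from the consequences of its own actions.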

Read More

The Algorithm Behind the Curtain: How DeepMind Built a Machine that Beat a Go Master (1 of 5)

Machine learning’s victory in the game of Go is a major milestone in computer science. In the first article in this series, we’ll explain why, and start dissecting the algorithms that made it happen.

In March, machine learning reached an important milestone: a computer program called AlphaGo beat one of the best Go players in the world—Lee Sedol—four times in a five-game series. At first blush, this win may not seem all that significant. After all, machines have been using their growing computing power for years to beat humans at games, most notably in 1997 when IBM’s Deep Blue beat world champ Garry Kasparov at chess. So why is the AlphaGo victory such a big deal?

The answer is two-fold. First, Go is a much harder problem for computers to solve than other games due to the massive number of possible board configurations. Backgammon has 10²⁰ different board configurations, Chess has 10⁴³ and Go has a whopping 10¹⁷⁰ configurations. 10¹⁷⁰ is an insanely large number—too big for humans to truly comprehend. The best analogy used to describe 10¹⁷⁰ is that it is larger than the number of atoms in the observable universe (roughly 10⁸⁰). The magnitude of 10¹⁷⁰ matters because it implies that if machine learning (ML) can outperform the best humans on a problem as large as Go, then ML can solve a new set of real-world problems that are far more complex than previously thought possible. This means that the potential for machine learning to impact our day-to-day lives in the near future just got a lot bigger.
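As a rough sanity check on that figure (my own back-of-the-envelope arithmetic, not a claim from the match coverage): each of the 361 points on the 19×19 board can be black, white, or empty, which bounds the number of arrangements at

3³⁶¹ ≈ 10¹⁷²

and once illegal arrangements are excluded, the number of legal positions works out to about 2.1 × 10¹⁷⁰.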

Read More

Customer Experience Eats Proximity Technology

Proximity technology alone won’t transform retail—it must be used to address customer needs in the digital age.

Proximity technology is a class of emerging technologies (which includes iBeacon, NFC, RFID and a host of others) that enable marketers to pinpoint the location of a customer at a particular point in time. Although proximity technology holds vast potential for marketers, it raises some legitimate concerns as well. Probably the most famous (or infamous) example of the dark side of proximity marketing was in the movie “Minority Report,” which depicted a world where people are under constant surveillance, allowing governments and businesses to track people continuously via retina scanners. In this futuristic landscape, digital billboards identify customers as they pass by and speak to them with highly personalized marketing messages: “Hello Mr. Yakimoto, welcome back to the Gap. How did those tank tops work out for you?”

Fortunately for us, ubiquitous, government-controlled retina scanners don’t exist in the real world. But an even more powerful and pervasive tracking device does — the smartphone. When paired with proximity technology, the smartphone provides all the computational horsepower necessary to create sci-fi-inspired personalized marketing experiences, experiences that truly add value for the customer rather than creating a dystopian landscape. So if that’s the case, why hasn’t proximity technology transformed retail?

Read More

Seven Practical Technology Leadership Principles

Being a great technologist requires very different skills than being a great technology leader. The key to making the transition is adopting the right mindset.

This article was published on 7/31/2017 in VentureBeat.

Technical managers are often promoted into leadership by rising through the ranks—more so than in most other disciplines. This is a practical approach, considering that business decisions today increasingly hinge on the nuanced details of the underlying technology. Technology leaders need to assess technical options, align recommendations with business requirements and communicate these decisions to non-technical stakeholders. If technology managers don’t understand the technology at a detailed level, it’s difficult for them to make the right call.

The challenge is that being a great engineer doesn’t automatically translate into being a great leader. Leadership—technical or otherwise—is not something one is born with; it is a skill developed over a lifetime. Unfortunately, many companies don’t have management training programs in place to cultivate leaders as they move up the org chart. And where such programs do exist, the training is typically generic and conceptual. General management training is an important first step, but it is insufficient by itself to prepare technology leaders for the tactical challenges that await them day-to-day in their new role.

Read More

New Year’s Security Resolutions

Seven steps that will make this year the most secure year yet.

It’s the New Year, which means it’s time for the annual human ritual of making personal promises to give up bad habits and commit to living life better going forward. While most people are focused on renewing their gym memberships or cutting out carbs, my New Year’s resolution is to help make the Internet a safer place. As an industry, our collective security bad habits caught up with us in what was a very bad year for security, and it’s time for a change. Here is but a small sampling of the headline-grabbing breaches that happened in 2015:

Read More

Three Reasons Why Docker is a Game Changer

Containers represent a fundamental evolution in software development, but not for the reasons most people think.

Docker’s rapid rise to prominence has put it on the radar of almost every technologist today, IT professionals and developers alike. Docker containers hold the promise of providing the ideal packaging and deployment mechanism for microservices, a concept that has also surged in popularity.

But while the industry loves its sparkling new technologies, it is also deeply skeptical of them. Until a new technology has been battle-tested, it’s just an unproven idea with a hipster logo, so it’s not surprising that Docker is being evaluated with a critical eye—it should be.

To properly assess Docker’s utility, however, it’s necessary to follow container-based architecture to its logical conclusion. The benefits of isolation and portability, which get most of the attention, are reasons enough to adopt Docker containers. But the real game changer, I believe, is the deployment of containers in clusters. Container clusters managed by a framework like Google’s Kubernetes allow for the true separation of application code and infrastructure, and enable highly resilient and elastic architectures.
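As a taste of what that separation looks like in practice, here is a minimal sketch using the official Kubernetes Python client (my illustration, not from the article; the app name and nginx image are placeholders):

```python
# Declare a Deployment: "run 3 replicas of this container image."
# The cluster decides where the containers run and restarts them on failure;
# the application code knows nothing about the underlying machines.
from kubernetes import client, config

config.load_kube_config()  # cluster credentials from ~/.kube/config

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="web"),
    spec=client.V1DeploymentSpec(
        replicas=3,  # elasticity: change this number to scale out or in
        selector=client.V1LabelSelector(match_labels={"app": "web"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "web"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="web", image="nginx:1.25")]
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```

Note that nothing in the declaration names a host: the desired state is separated from the infrastructure that satisfies it, which is exactly the resilience and elasticity argument above.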

Read More

Strengthen Your AWS Security by Protecting App Credentials and Automating EC2 and IAM Key Rotation

Effective information security requires following strong security practices during development. Here are three ways to secure your build pipeline, and the source code to get you started.

One of the biggest headaches faced by developers and DevOps professionals is keeping the credentials used in application code secure. It’s just a fact of life: we have code that needs to access network resources like servers and databases, and the credentials for those resources have to be stored somewhere. Even in the best of circumstances this is a difficult problem to solve, but the messy realities of daily life further compound the issue. Looming deadlines, sprawling technology and employee turnover all conspire against us when we try to keep the build pipeline secure. The result is “credential detritus”: passwords and security keys littered across developer workstations, source control repos, build servers and staging environments.

Use EC2 Instance Profiles

A good strategy for minimizing credential detritus is to reduce the number of credentials that need to be managed in the first place. One effective way to do this in AWS is by using EC2 Instance Profiles. An Instance Profile is an IAM Role that is assigned to an EC2 instance when it’s created. Once this is in place, any CLI or SDK calls to AWS resources made by code running on the EC2 instance are executed within the security context of the Instance Profile. This is extremely handy because it means you don’t need to worry about getting credentials onto the instance when it’s created, and you don’t need to manage them on an ongoing basis—AWS automatically rotates the keys for you. Instead, you can spend your time fine-tuning the security policy for the IAM Role to ensure that it has the fewest privileges needed to get its job done.
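To illustrate, here is a minimal sketch of my own (not the article’s source code; it assumes the instance was launched with a profile whose role allows s3:ListAllMyBuckets):

```python
# Runs on an EC2 instance that has an Instance Profile attached.
# Note there are no access keys anywhere: boto3 discovers temporary,
# auto-rotated credentials via the instance metadata service.
import boto3

s3 = boto3.client("s3")  # credentials resolved from the Instance Profile

# Succeeds only if the profile's IAM Role grants s3:ListAllMyBuckets,
# which is where least-privilege policy tuning comes in.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```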

Read More

Eight Reasons Why Agile Motivates Project Teams

Research proves what software developers already know: Agile projects are more fun and inspiring to work on. In this article, we review the science that explains why Agile fosters greater motivation.

A few weeks ago, I finished conducting a series of video retrospectives with several POP team members who recently completed Agile/Scrum projects. The goal of these one-on-one interviews was to elicit the kinds of critical insights that can only be discovered through in-the-trenches experience. Recording the conversations on video allowed me to quickly distribute these Agile learnings to the larger agency in easy-to-digest bites.

It was great listening to the team talk about their Scrum experiences, but what struck me the most was the universal belief among the people I talked to that Agile projects were more fun and motivating than Waterfall projects. I wouldn’t have considered this a pattern if the people I interviewed had all worked on the same project. But the team members I spoke with worked on a variety of projects, ranging from e-commerce sites to mobile apps to frontend-focused work. Furthermore, the participants came from different departments, including design, development, project management and QA. Yet despite these differences, it was clear that everyone I talked to shared one thing in common: they all had a much higher level of satisfaction and motivation when working on Agile projects. So for me the big question was: Why? What was it about Agile that fostered greater motivation and better performance than a typical Waterfall project?

Read More

Using Vagrant, Chef and IntelliJ to Automate the Creation of the Java Development Environment

The long path to DevOps enlightenment begins with the developer’s IDE: here’s how to get started on the journey. In this article, we walk through the steps for automating the creation of a virtual development environment.

One of the challenges facing software developers who work on cloud applications and distributed systems today is setting up the developer workstation in a development environment composed of an ever-increasing number of services and technologies. It was already hard enough to configure developer workstations for complex monolithic applications, and it’s even harder now that we’re breaking applications down into multiple microservices and databases. If you are starting to feel like your developers’ workstations have become fragile beasts that are able to generate builds only by the grace of God and through years of mystery configuration settings, then you are headed for trouble. Seek medical help immediately if you are experiencing any of the following symptoms:

  • The onboarding of new developers takes days or even weeks because getting a new development machine configured is a time-consuming and error-prone process.
  • The words “But the code works on my machine” are uttered frequently within your organization.
  • Bugs are often discovered in production that don’t occur in development or staging.
  • The documentation for deploying the application to production is a short text file with a last modified date that’s over a year old.

The good news is that there are technologies and practices to remedy these problems. The long-term cure for this affliction is cultivating a DevOps culture within your organization. DevOps is the hybrid of software development and infrastructure operations. With the rise of virtualization and cloud computing, these two formerly separate departments have found themselves bound together like conjoined twins. In the cloud, hardware is software, and thus software development now includes infrastructure management.
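“Hardware is software” is easy to see in code. The following sketch is my own illustration, not part of this article’s Vagrant/Chef walkthrough, and the AMI ID is a placeholder:

```python
# Provisioning "hardware" the software way: one API call creates a server.
# Placeholder values throughout; this is illustrative, not production config.
import boto3

ec2 = boto3.resource("ec2")
instances = ec2.create_instances(
    ImageId="ami-12345678",   # placeholder machine image
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
print(instances[0].id)  # a running server, requested entirely from code
```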

Read More

The cloud is more than just a new place to park your app: it’s a paradigm shift in how we build software

Cloud computing makes possible a new breed of applications that are much more robust and highly tolerant of change. Here are 10 key architectural considerations when developing applications born in the cloud.

There was a time, back in the day, when life as a software architect was simple. Release cycles were measured in half-years and quarters. Application traffic was significant, but not insane. Data was always persisted in a centralized, relational database (because that was the only option). And best of all, the applications themselves were hosted on high-end, company-owned hardware managed by a separate operations team. Life back in the day was good.

But then the Internet kept growing.

Release cycles got shorter as the business fought to remain competitive. Traffic continued to grow, and huge spikes could happen at any time. The relational database was coming apart at the seams, no matter how much iron was thrown at it. And in the midst of it all, everyone started talking incessantly about this new thing called “the cloud” that was supposed to fix everything. The brief period of easy living had come to an end.

Read More

Proximity marketing has arrived. Here’s the blueprint for creating a one-to-one digital conversation with your shopper in-store today.

Emerging technologies like iBeacon and Near Field Communication (NFC) have opened up the possibilities for unparalleled in-store interactivity with shoppers. The key is staying focused on using this new tech to actually enhance the shopping experience for the customer.

Emerging in-store positioning technologies like iBeacon hold the promise of highly personalized, “Minority Report”-like marketing programs. However, this technology is still at a very early stage. Retailers who adopt the technology first—and are able to execute it brilliantly—will almost certainly gain a competitive advantage. But the challenge is that it’s not entirely clear which experiences we can create today that actually offer a better shopping experience. Much of what the industry is talking about now centers on using proximity technology to offer coupons to shoppers in-store. I, for one, think we can do a lot better than incessantly pushing discounts to shoppers as they peruse the aisles.

At POP, the innovation team wanted to weed out the hype from the reality by building a real, working prototype using today’s technology to create an in-store shopping experience that didn’t suck. We wanted to build something that added value to the shopping experience for the customer and promoted stronger sales for the retailer.

Read More

Six Practical Steps You Should Take to Protect Yourself from Cyber Criminals

By dissecting the methods used by hackers in the recent wave of cyber attacks, we can identify ways to stay more secure online.

A rash of cyber attacks and security news hit over the Labor Day weekend, impacting The Home Depot, Healthcare.gov, Goodwill and Apple. But this recent flurry of security activity is positive in at least one respect: it gives us a glimpse into the mechanics of real-world attack scenarios. The more we use this as a learning opportunity, the safer we’ll be. Here are a few lessons we should take away from the attacks:

1. Understand that even if you do everything right, you’re still not safe

During the first few days of the September iCloud breach, in which explicit pictures of several celebrities were stolen via Apple’s iCloud backup service, many people were saying that the victims should have used two-factor authentication to protect their information (sadly, another example of the “blame the victim” mentality). It was later disclosed, however, that Apple’s two-factor authentication didn’t actually cover iCloud backups. So even if you were one of the rare, paranoid people who use two-factor authentication, it wouldn’t have protected you.

In a similar vein, having the most secure password in the world wouldn’t have helped the customers of Home Depot or Goodwill, whose stolen credit cards were used in-store. If the people processing your credit cards get hacked, no amount of cyber protection will save you.

Read More