Building Serverless Apps with AWS, Lambda, Python & Zappa

Popular open source Python framework Zappa eases the pain of serverless development

Serverless architecture has quickly emerged as the up-and-coming way to develop cloud-based applications. The benefits of serverless computing are just too good to ignore: zero server maintenance, out-of-the-box scalability and ground-up support for asynchronous workflows. It’s a developer’s dream. But serverless architecture comes with a catch. These significant operational benefits come at a hefty price for developers: laborious configuration, immature frameworks, a lack of tooling and a dearth of established design patterns. Until very recently, going serverless basically meant throwing out decades of established tools and practices, and starting over. In a nutshell, making serverless apps can be painful.

The pain, however, is slowly going away. It began with Amazon Web Services (AWS) releasing the Serverless Application Model (SAM). SAM is essentially a simplified template syntax for CloudFormation (AWS’s infrastructure-as-code framework) that reduced the jaw-dropping amount of code previously required to automate the deployment of serverless apps. SAM made configuration a little better for some use cases, but it was only a small incremental improvement. The fundamental problem remained: developing applications as Lambda functions required starting over with new design patterns and Lambda-compatible libraries. For the Python crowd, this meant kissing time-saving application frameworks like Django and Flask goodbye.
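To illustrate what framework-less, Lambda-native development looks like, here is a minimal sketch of a raw Lambda handler for an API Gateway proxy event (the query parameter and greeting are invented for illustration):

```python
import json

def handler(event, context):
    """Entry point that Lambda invokes with a JSON-like event dict.

    The event and response shapes follow API Gateway's Lambda proxy
    integration: query strings arrive in the event, and the response
    must be a dict with statusCode, headers and a string body.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Zappa’s appeal is that it generates this kind of handler plumbing for you, so an ordinary Flask or Django app can run on Lambda largely unchanged.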

Read More

Marketing and AI: Separating Fact from Fiction

How marketers can avoid getting duped by ‘faux AI’ vendors

This article was published on 4/3/2019 in VentureBeat.

As a software engineer whose clients are marketing professionals, I’ve gained a great deal of empathy for marketers over the years. Not because marketing technology is evolving so rapidly (which it is), but because the proliferation and misuse of buzzwords is so rampant, with the term “AI” leading the way. This is unfortunate because AI — or more accurately, machine learning (ML), a subset of AI — has huge potential for marketers. Making matters worse, when vendors combine buzzwords like AI with other buzzy, ill-defined technologies like “marketing automation,” the result is a buzzword soup of confusion. How can marketers separate reality from the fake marketing news?

It’s important for marketers to gain a high-level understanding of how ML works. If you don’t understand the basic concepts, it’s much easier to be taken for a ride. For ML newbies, I recommend Microsoft’s Data Science for Beginners. This video series provides an excellent non-technical overview of ML.

Read More

AI-Enabled Personalization is Easier Than You Think

Four ways to start incorporating AI and personalization into your marketing today

This article was published on 10/23/2017 in VentureBeat.

Since the invention of mass media, arguably, the primary focus of marketing has been to increase its level of personalization. Marketers constantly seek more targeted audiences, and strive to deliver messages that speak more directly to their diverse audiences. So, it's no surprise that AI and machine learning — with their ability to predict consumer behavior and make personalized recommendations on-the-fly — have captured the attention of the marketing world. But the elephant in the room is that advances in machine learning have far outpaced most marketers' ability to harness them.

Unfortunately, this inability to personalize the customer experience is a huge missed opportunity. Customers now expect a tailored experience, including customized recommendations and a personal touch. And customers are willing to reward companies who provide it. According to Gartner, "By 2018, organizations that have fully invested in all types of personalization will outsell companies that have not by 20%." Even more alarming, customers are increasingly likely to dump brands that don't offer personalization. According to a 2016 Salesforce study: "… more than half (52%) of consumers are likely to switch brands if a company doesn't make an effort to personalize communications to them, 65% of business buyers say the same about vendor relationships."

Read More

Machine Learning Comes to the Masses

How a new wave of machine learning will impact today’s enterprise

This article was published on 7/17/2017 in VentureBeat.

Advances in deep learning and other machine learning algorithms are currently causing a tectonic shift in the technology landscape. Technology behemoths like Google, Microsoft, Amazon, Facebook and Salesforce are engaged in an artificial intelligence (AI) arms race, gobbling up machine learning talent and start-ups at an alarming pace. They are building AI technology war chests in an effort to develop an insurmountable competitive advantage.

While AI and machine learning are not new, the current momentum behind AI is distinctly different today, for several reasons. First, advances in computing technology (GPU chips and cloud computing, in particular) are enabling engineers to solve problems in ways that weren’t possible before. These advances have a broader impact than just the development of faster, cheaper processors, however. The low cost of computation and the ease of accessing cloud-managed clusters have democratized AI in a way that we’ve never seen before. In the past, building a computer cluster to train a deep neural network would have required access to deep pockets or a university research facility. You would have also needed someone with a Ph.D. in mathematics who could understand the academic research papers on subjects like convolutional neural networks.

Read More

AWS Serverless Architecture In Practice

Five key takeaways for designing, building and deploying serverless applications in the real world

This article was published on 4/3/2017 in VentureBeat.

The term “serverless architecture” is a recent addition to the technology lexicon, coming into common use within the last year or so, following the launch of AWS Lambda in 2014. The term is both quizzical and provocative. Case in point: while explaining the concept of serverless architecture to a seasoned systems engineer recently, he literally stopped me mid-sentence—worried that I had gone insane—and asked: “You realize there is actual hardware up there in the cloud, right?” Not wanting to sound crazy, I said yes. But secretly I thought to myself: “Yet, if my team doesn’t have to worry about server failures, then for all practical purposes, hardware doesn’t exist in the cloud—it might as well be unicorn fairy dust.” And that, in a nutshell, is the appeal of serverless architecture: the ability to write code on clouds of cotton candy, without a concern for the dark dungeons of server administration.

But is the reality as sweet as the magical promise? At POP, we put this question to the test when we recently deployed an app in production utilizing a serverless architecture for one of our clients. However, before we review the results, let’s dissect what serverless architecture is.

Read More

The Sisyphean Challenge of Senior Technology Leadership

Technology managers find themselves between a rock and a hard place, forced to choose between focusing on technical depth or leadership excellence. A potential solution comes from an unlikely source.

This article was published on 11/6/2016 in VentureBeat.

A workplace dynamic I’ve always found fascinating is the instinctual need for people to size up the technical depth of a technology leader upon first introduction. The hands-on technologists in the room want to determine if the manager understands what they do on a day-to-day basis. The non-technical people want to assess if she’ll be able to communicate clearly, or if she speaks in technical gibberish.

This social dynamic is a natural side effect of the dual nature of the senior technology leadership role. On the one hand, technology managers must create and operate code and infrastructure, which requires detailed, technical knowledge. On the other hand, they must translate technical concepts into business strategy and manage a team, which requires communication and leadership skills.

The challenge for senior technology leaders is that we can’t do both perfectly. Therefore, the goal of the CTO and other senior technology leaders is to strike the right balance between technical depth and business leadership, based on the size and focus of the company. However, this is easier said than done.

Read More

The Algorithm Behind the Curtain: Building an Artificial Brain with Neural Networks (4 of 5)

Neural Networks form the foundation for Deep Learning, the technique AlphaGo used with Reinforcement Learning (RL) to beat a Go master. In this article, we’ll explain how the basics of neural networks work.

The focus of this series is to dissect the methods used by DeepMind to develop AlphaGo, the machine learning program that shocked the world by defeating a worldwide Go master. By peeking under the hood of DeepMind’s algorithm, we hope to demystify Machine Learning (ML) and help people understand that ML is merely a computational tool, not a dark art destined to bring about the robot apocalypse. In the earlier articles we discussed why AlphaGo’s victory represents a breakthrough, and we explained the concepts and algorithms behind reinforcement learning—a key component of DeepMind’s program. In this article, we’ll explore artificial neural networks. Neural networks form the foundation of deep learning, the technique that enabled DeepMind’s reinforcement learning algorithm to solve extremely large and complex problems like Go. Deep learning is an advanced form of artificial neural network. So, before we dive into deep learning in the next article, we’ll first explore how a neural network operates.
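As a sketch of the basics the article covers, a single artificial neuron is just a weighted sum of its inputs passed through an activation function; the weights, bias and inputs below are arbitrary examples:

```python
import math

def sigmoid(x):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus a bias, passed through the activation.
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Two inputs, two weights, one bias — all chosen arbitrarily.
output = neuron([0.5, 0.8], [0.4, -0.2], 0.1)  # a value between 0 and 1
```

A network is many of these neurons wired in layers; "learning" means adjusting the weights so the outputs better match known examples.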

Read More

The Algorithm Behind the Curtain: Understanding How Machines Learn with Q-Learning (3 of 5)

Reinforcement Learning (RL) is the driving algorithm behind AlphaGo, the machine that beat a Go master. In this article, we explore how the components of an RL system come together in an algorithm that is able to learn.

Our goal in this series is to gain a better understanding of how DeepMind constructed a learning machine — AlphaGo — that was able to beat a worldwide Go master. In the first article, we discussed why AlphaGo’s victory represents a breakthrough in computer science. In the second article, we attempted to demystify machine learning (ML) in general, and reinforcement learning (RL) in particular, by providing a 10,000-foot view of traditional ML and unpacking the main components of an RL system. We discussed how RL agents operate in a flowchart-like world represented by a Markov Decision Process (MDP), and how they seek to optimize their decisions by determining which action in any given state yields the most cumulative future reward. We also defined two important functions, the state-value function (represented mathematically as V) and the action-value function (represented as Q), that RL agents use to guide their actions. In this article, we’ll put all the pieces together to explain how a self-learning algorithm works.
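The Q-learning update at the heart of such a self-learning algorithm fits in a few lines; the two-state toy world, learning rate and discount factor below are invented for illustration:

```python
from collections import defaultdict

# Tabular Q-learning update rule:
#   Q(s,a) += alpha * (reward + gamma * max_a' Q(s',a') - Q(s,a))
ALPHA, GAMMA = 0.5, 0.9          # learning rate and discount factor
ACTIONS = ["left", "right"]
Q = defaultdict(float)           # Q-values default to 0.0

def update(state, action, reward, next_state):
    # Nudge Q(s,a) toward the observed reward plus the best
    # discounted value reachable from the next state.
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

# One experience: taking "right" in s0 earned reward 1.0 and led to s1.
update("s0", "right", 1.0, "s1")
```

Repeated over many experiences, these small nudges converge on Q-values that tell the agent which action yields the most cumulative future reward in each state.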

Read More

The Algorithm Behind the Curtain: Reinforcement Learning Concepts (2 of 5)

Reinforcement Learning (RL) is at the heart of DeepMind’s Go playing machine. In the second article in this series, we’ll explain what RL is, and why it represents a break from mainstream machine learning.

In the first article in this series, we discussed why AlphaGo’s victory over world champ Lee Sedol in Go represented a major breakthrough for machine learning (ML). In this article, we’ll dissect how reinforcement learning (RL) works. RL is one of the main components used in DeepMind’s AlphaGo program.

Reinforcement Learning Overview

Reinforcement learning is a subset of machine learning that has its roots in computer science techniques established in the mid-1950s. Although it has evolved significantly over the years, reinforcement learning hasn’t received as much attention as other types of ML until recently. To understand why RL is unique, it helps to know a bit more about the ML landscape in general.

Most machine learning methods used in business today are predictive in nature. That is, they attempt to understand complex patterns in data — patterns that humans can’t see — in order to predict future outcomes. The term “learning” in this type of machine learning refers to the fact that the more data the algorithm is fed, the better it is at identifying these invisible patterns, and the better it becomes at predicting future outcomes.
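A minimal sketch of this kind of predictive learning is an ordinary least-squares line fit: the training examples encode a hidden pattern, and the fitted model predicts an outcome for an input it has never seen (the data below is synthetic):

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a*x + b.
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - a * mean_x
    return a, b

# Training examples generated from the hidden pattern y = 2x + 1.
xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]
a, b = fit_line(xs, ys)
prediction = a * 10 + b   # predict the outcome for the unseen input x = 10
```

Real predictive models are vastly more complex, but the principle is the same: more (and better) example data generally yields a model that tracks the underlying pattern more faithfully.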

Read More

The Algorithm Behind the Curtain: How DeepMind Built a Machine that Beat a Go Master (1 of 5)

Machine learning’s victory in the game of Go is a major milestone in computer science. In the first article in this series, we’ll explain why, and start dissecting the algorithms that made it happen.

In March, an important milestone for machine learning was accomplished: a computer program called AlphaGo beat one of the best Go players in the world—Lee Sedol—four times in a five-game series. At first blush, this win may not seem all that significant. After all, machines have been using their growing computing power for years to beat humans at games, most notably in 1997 when IBM’s Deep Blue beat world champ Garry Kasparov at chess. So why is the AlphaGo victory such a big deal?

The answer is two-fold. First, Go is a much harder problem for computers to solve than other games due to the massive number of possible board configurations. Backgammon has 10^20 different board configurations, Chess has 10^43 and Go has a whopping 10^170 configurations. 10^170 is an insanely large number—too big for humans to truly comprehend. The best analogy used to describe 10^170 is that it is larger than the number of atoms in the universe. The reason that the magnitude of 10^170 is so important is because it implies that if machine learning (ML) can perform better than the best humans for a large problem like Go, then ML can solve a new set of real-world problems that are far more complex than previously thought possible. This means that the potential that machine learning will impact our day-to-day lives in the near future just got a lot bigger.
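Python’s arbitrary-precision integers make the scale gap easy to check directly, using the rough configuration counts cited above:

```python
# Approximate board-configuration counts for each game.
backgammon = 10 ** 20
chess = 10 ** 43
go = 10 ** 170

# Go's search space exceeds chess's by 127 orders of magnitude —
# the ratio itself is a 128-digit number.
ratio = go // chess
ratio_exponent = len(str(ratio)) - 1
```

Numbers at this scale are why brute-force search, which served chess engines like Deep Blue well, is hopeless for Go.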

Read More