Self-Serve is a Great Model for Vendors – and Customers

Let’s get it out there right away. Self-serve is an effective, cost-saving strategy for software vendors. Some critics see the move to self-serve as a cynical attempt to offload costs (time, expense) onto customers. But there’s also a huge benefit to customers – if done correctly. Self-serve allows customers to access training and get answers when they need them.

There’s a precedent for this. Think back to the 1970s and 1980s, when corporations offloaded the clerical work formerly performed by legions of administrative personnel onto the workloads of pretty much everyone left after the cuts. It was essentially self-serve enabled by advances in technology. And while workers reacted negatively to losing secretarial assistance for report generation and letter-writing, the adoption of new office productivity technologies in the ensuing years validated the decision to expect employees to assume more responsibility. It took time, but eventually we all became expert email and Excel users.

Similarly, the cost case for providing a self-serve portal to customers is so clear that it’s hard to imagine disputing the strategy, and the industry has moved definitively in that direction. In case it isn’t obvious, the Technology Services Industry Association (TSIA), in its recent Technology Services Heatmap, produced a table showing the adoption of 42 technologies employed in the post-sales world of enterprise software. Self-serve is one of only four technologies deployed in more than 75% of enterprise companies.

2018 TSIA Technology Heatmap

Peak self-serve? Perhaps, but don’t be fooled into thinking the bargain is all in favor of companies. Customers enjoy significant benefits too, as highlighted in this article in the Harvard Business Review. Self-serve offers them convenience and speed, and in this age of choice those attributes can help address the imbalances presented by the eternal yin and yang of SaaS: retention and churn. Both customers and vendors benefit from self-serve because it places the concept of customer value front and center. Vendors commit to providing what customers need, and customers commit to doing many things for themselves as long as they continue to receive, through the portal, what they need.

Upon deeper consideration, evaluating self-serve through the lens of customer value is actually the more important side of the self-serve equation, and smart companies recognize this. They know they need to invest in a comprehensive strategy that forces them to carefully tend to their side of the bargain: if they hope to maintain the ROI of the technology, links need to always work, documentation needs to be current, relevant, and accurate, and interactive features need to be, well, interactive. Furthermore, smart companies use self-serve to better understand their customers by measuring engagement and interaction, and they use the knowledge gained to nurture their customer relationships more personally.

Self-serve is a critical, and modern, customer-enabling strategy and it constitutes a major plank in the platform of our new service model announced on May 7.

In the end, smart companies know that the question of helping customers achieve their expected business outcomes isn’t really about what those customers want. Smart companies know the more relevant question revolves around what customers need and to answer that, self-serve removes a lot of the guesswork by opening a window and letting the customers in.

Oracle Blogs | Oracle Marketing Cloud

Are People Watching Your Landing Page Videos? Here’s How to Use Google Tag Manager to Check

In 2018, video marketing has become ubiquitous in news feeds, and it’s one of the best persuasion tools available to you. In fact, 72% of businesses say video has improved their conversion rates. Naturally, because your landing pages are designed to persuade and convert, it makes total sense that you’d want to use videos to boost the power of your offer.

But how do you know if visitors are actually interacting with your landing page videos? If you’re spending money on producing video content (especially if it’s offer-specific), you’ll want to know if your target audience is engaging.

While some of you may have access to a video marketing platform and resulting analytics, this post is going to share how you can get view information for YouTube video players using the free tool Google Tag Manager.

Once you follow the steps below for your Unbounce pages, you’ll be able to see:

  • If visitors are actually watching the videos on your landing pages
  • How long visitors are watching, and
  • Where visitors are dropping off (this can help you understand what content to modify to keep visitors engaged).

First up: Add Google Tag Manager to Track Your Landing Page Videos

This is really easy to do in Unbounce. First:

  1. Head to the Script Manager under your Settings tab.
  2. Then, click the green “Add a Script” button.
  3. Next, select the Google Tag Manager option.
  4. Assuming you’ve already signed up for Google Tag Manager, you can add your Container ID.

Set up of Google Tag Manager

Lastly, attach your domain to the script, and you’re all set!
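Under the hood, the Script Manager injects Google’s standard container snippet into your pages. For reference, it looks like this (with GTM-XXXXXXX standing in for your own Container ID):

```html
<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXXX');</script>
<!-- End Google Tag Manager -->
```

You shouldn’t need to paste this yourself, since Unbounce generates it from your Container ID, but it’s handy to recognize when you’re debugging with Google Tag Assistant.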

Once you have the script saved, use Google Tag Assistant to confirm the tag is working. With Tag Manager in place, next we’ll want to define how to track user interactions with our YouTube embeds, which brings us to…

Create Tags to Track Video Engagement

On September 12, 2017, Google Tag Manager released the YouTube Video Trigger which finally gave marketers the opportunity to track engagement from embedded YouTube videos within Google Analytics. Tag Manager added built-in video variables, and we want to confirm they are selected before creating any tags or triggers.

When you get to the Variables page in Google Tag Manager:

  • Click the red Configure button and check the boxes for all the video variables, as seen in the image below:

Configuring Built in Variables

Next, we can create our trigger. Triggers control how the tag will be fired. The only option we need is the YouTube Video trigger type.

From here you can select the specific information you want to capture. These actions include when a user starts a video, completes a video, pauses, seeks, or buffers, and how much of the content they actually watch.
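Behind the scenes, each of these interactions arrives as an event pushed onto the page’s dataLayer, and the built-in video variables read their values from that push. Here’s a sketch of one such push (the field names follow Tag Manager’s built-in video variables; the example values are made up):

```javascript
// In the browser, Tag Manager bootstraps window.dataLayer; it's declared
// locally here so the sketch is self-contained.
var dataLayer = [];

// The kind of push Tag Manager makes when a viewer crosses a
// progress threshold on an embedded YouTube video.
dataLayer.push({
  event: 'gtm.video',
  'gtm.videoProvider': 'youtube',
  'gtm.videoStatus': 'progress',  // also 'start', 'pause', 'buffering', 'complete'
  'gtm.videoUrl': 'https://www.youtube.com/watch?v=VIDEO_ID',
  'gtm.videoTitle': 'Landing Page Offer Video',
  'gtm.videoDuration': 120,       // total length, in seconds
  'gtm.videoCurrentTime': 30,     // playback position when the event fired
  'gtm.videoPercent': 25,         // the threshold just crossed
  'gtm.videoVisible': true        // was the player in the viewport?
});
```

Your triggers and tags then read these fields (for example, firing only when the status is 'progress').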

See how people are engaging (or not) with landing page videos

In the image above, we see just one option of a trigger you can create. If you select ‘Progress’, you have to choose either Percentages or Time Thresholds; you can’t use both. Using Percentages, you can add any numbers you like (they don’t have to be the numbers I used in the example above). Tag Manager will automatically add 100 for a completion.

On the other hand, if you choose ‘Time thresholds’, you will add the numbers (in seconds) you’d like to have recorded in Google Analytics. If your campaign focus is on views, I’d stick with Percentages. But, if you want to see where users are dropping off to help you improve the content of your videos, Time Thresholds is a good choice.
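To make the Percentages behaviour concrete, here’s a toy model (my own sketch, not Tag Manager’s actual code): every threshold at or below the viewer’s deepest watch point fires, and 100 is always included for completions.

```javascript
// Toy model of the Progress trigger's Percentages option: returns the
// thresholds that fire for a given watch depth (as a percentage).
function firedThresholds(watchedPercent, thresholds) {
  // Tag Manager automatically adds 100 for a completion.
  var all = Array.from(new Set(thresholds.concat(100)));
  return all
    .sort(function (a, b) { return a - b; })
    .filter(function (t) { return watchedPercent >= t; });
}
```

So a viewer who makes it 60% of the way through a video tracked at 10, 25, 50, and 75 would register the 10%, 25%, and 50% events, but not 75% or the completion.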

Lastly, choose when the trigger will fire. By default Tag Manager will fire the trigger on all videos, but you can choose to fire on only some videos.

You can also make your video triggers a lot more specific. The image below shows several options you have to fire the tag on a variety of custom variables for your YouTube videos. If you only want to track videos on certain landing pages, you can do that, but if you only want to track certain videos no matter what the landing page is, you have that option too. Create the trigger which will give you the data you need to make better decisions about the videos on your landing pages.

Now let’s set up the tag!

The image below is just one example of a completed tag setup. Here, you can change the Category, Action, and Label to capture the appropriate video data you want to collect. You can also research and find some cool custom versions of these tags, like Simo Ahava’s YouTube Video Trigger. There are many options out there, so find the tag which works best for you.

Now that we can track the YouTube video interactions, let’s view the data.

View the Events Report in Google Analytics

In Google Analytics, head over to Behavior > Events. In the Overview or Top Events sections, you can see the Event Category list for whatever you are tracking. While Event Category is the default view, you can switch to Event Action or Event Label to get deeper data, depending on how you set up your tag.

So, how do you relate YouTube video tracking to your landing pages? Easy. Click on Secondary dimension, search for “landing pages” and select it. From here you’ll be able to see the page URL path alongside the current view you have pulled up.

We now have the data in Google Analytics to see which videos users interact with the most, how long users are watching the embedded YouTube videos, and which landing pages are actually seeing video engagement.

Now You Have Data to Improve the Videos on Your Landing Pages

If you find visitors barely watch your videos (think viewing less than 30% of the content), you now have data to push your team to modify the length of the videos, for example, or get to your key message differently (perhaps you have a really long intro?).

If the data shows users aren’t watching your videos at all, you may want to replace the video on your landing page with other, more customized options, or even text that sums up the value props presented. Finally, if you identify really popular videos, it could be a great opportunity to reuse them on other relevant pages, too.

Overall, you won’t know whether the videos on your landing pages resonate with visitors unless you track this. Let me know in the comments below if you have any questions on the setup above – happy to jump in with answers.


How to Make Your Unbounce Landing Pages GDPR Compliant

You might not wake up each morning thinking about data privacy and security but, like it or not, Facebook’s recent move makes it an issue you can’t dismiss. Long before Mark Zuckerberg sat before Congress in the face of the Cambridge Analytica scandal, explaining how Facebook uses personal data, the European Union started getting especially serious about data protection and privacy.

And so, on May 25 2018, the EU’s General Data Protection Regulation (GDPR) goes into effect.

In a nutshell, the GDPR legislation gives everyone in the EU greater privacy rights, and introduces new rules for marketers and software providers to follow when it comes to collecting, tracking, or handling EU-based prospects’ and customers’ personal data.

Moreover, the GDPR applies to anyone who processes or stores the data of those in the EU (i.e. you don’t need to be physically located in Europe for this to apply to your business), and non-compliance can incur fines of up to 4% of your annual global turnover or €20 million, whichever is greater.
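To put numbers on the “whichever is greater” rule, here’s a quick illustrative calculation (not legal advice, and real fines are decided case by case):

```javascript
// Maximum possible GDPR fine: the greater of 4% of annual global
// turnover or a flat EUR 20 million (illustrative only).
function maxGdprFine(annualGlobalTurnoverEur) {
  return Math.max(0.04 * annualGlobalTurnoverEur, 20000000);
}
```

A business turning over €1 billion could face up to €40 million, while a business turning over €10 million is still exposed to the full €20 million ceiling, since the flat amount is greater.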

But Beyond Potential Fines, Here’s Why You Need to Care

On Tuesday, April 3rd, Zuckerberg said that Facebook had no plans to extend the GDPR regulations globally to all Facebook users. But fast-forward a few weeks and Facebook completely changed its tune, now planning to extend Europe’s GDPR standards worldwide.

This move sets a precedent, showing all of us that no matter where we are in the world, personal data and privacy laws aren’t optional. Compliance is table stakes.

If you’re located in Europe, process lead and customer data from Europe, or just happen to believe in high standards for data privacy and security, this post will help you navigate:

  • What Unbounce has done to become GDPR compliant, and
  • Some of what you need to do to make sure your landing pages, sticky bars, and popups adhere to the new rules.
Note: This post isn’t the be-all and end-all on EU data privacy, nor is it legal advice. It’s meant to provide background information and help you better understand how you can use Unbounce in a GDPR-compliant way.

Data Protection by Default for You and Your Customers

For several months now, Unbounce has been investing heavily in the necessary changes to be GDPR compliant as a conversion platform. We believe that to build trust and confidence with your customers, you need to make their privacy your priority.

As of the day of GDPR enforcement, you can be sure we’ve got your back when it comes to processing and storing your data within Unbounce, and giving you the tools you need to run compliant campaigns.

To see exactly what Unbounce has been doing, why it matters and where we’re at in development, check out our GDPR FAQ page.

But while we’re a GDPR compliant platform with privacy and security safeguards built into our business practices and throughout our platform, this is only part of the equation. There are still a few things you are responsible for to use Unbounce in a compliant way, including:

  • Obtaining consent from your visitors (lawful basis of processing)
  • Linking to your privacy policy (informing visitors of your data protection policies)
  • Deleting personal data if requested (right to erasure)
  • Encrypting lead data in transit and at rest (using SSL), and
  • Signing a data processing addendum (DPA) with Unbounce

Here’s what you’re gonna want to watch for as you build landing pages, popups, and sticky bars.

Obtaining Consent From Your Visitors

Before collecting someone’s data the GDPR states you must first have a legal basis to do so. There are six lawful bases of processing under the GDPR, but if you’re a digital marketer, your use case will most likely fall into one of the following three:

  1. Consent (i.e. opt-in)
  2. Performance of a contract (e.g. sending an invoice to a customer)
  3. “Legitimate interest” (e.g. someone is an existing customer and you want to send them information related to a product or service they already have)

If you are using Unbounce for lead gen, then you must gather consent via opt-in to collect, use, or store someone’s data. When building your landing pages in Unbounce, you can easily add an opt-in field to your forms with the Unbounce form builder:

Keep in mind: Your visitors must actively check your opt-in box to give consent. Pre-checked checkboxes are not a valid form of consent.
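If you’re ever hand-coding a form outside the Unbounce builder, a valid opt-in field boils down to something like this (the field name and wording here are placeholders of my own):

```html
<label>
  <!-- Must start unchecked: pre-checked boxes are not valid consent -->
  <input type="checkbox" name="marketing_consent" value="yes">
  Yes, I’d like to receive marketing emails. See our privacy policy for details.
</label>
```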

Related But Different: Cookies And The ePrivacy Regulation

In many posts you’ll see Europe’s ePrivacy regulations tied in with the GDPR, but they are, in fact, two separate things. While the GDPR regulates the general use and management of personal data, cookie use is core to the ePrivacy regulation (which is why you’ll sometimes see it called the “cookie law”). The ePrivacy regulations are still in the works, but they will certainly address visitor consent to cookies on your site.

We know the ePrivacy directive requires “prior informed consent” to store or access information on your visitors’ device. In other words, you must ask visitors if they consent to the use of cookies before you start to use them.

Last year Unbounce launched sticky bars (a discreet, mobile-friendly way to get more conversions), but they do double duty as a cookie bar, notifying your visitors about cookies.

You can design and publish a cookie bar using Unbounce’s built-in template, as shown below, then use Script Manager to embed it across every landing page you build in Unbounce. You can even have it appear across your entire website.
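Unbounce’s template handles this for you, but purely to illustrate the mechanics, a minimal hand-rolled cookie bar boils down to something like this (the element IDs and cookie name are my own placeholders):

```html
<div id="cookie-bar">
  This site uses cookies. <a href="/privacy">Learn more</a>
  <button id="cookie-ok">Got it</button>
</div>
<script>
  // Hide the bar if the visitor has already acknowledged cookies.
  var bar = document.getElementById('cookie-bar');
  if (document.cookie.indexOf('cookies_ok=1') !== -1) {
    bar.style.display = 'none';
  }
  // Otherwise, remember the acknowledgement for a year on click.
  document.getElementById('cookie-ok').onclick = function () {
    document.cookie = 'cookies_ok=1; max-age=31536000; path=/';
    bar.style.display = 'none';
  };
</script>
```

Note that the acknowledgement cookie is only set after the visitor clicks, in keeping with the “prior informed consent” idea.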

Informing Visitors of Your Data Protection Policies

It’s not enough to just obtain consent; the GDPR also requires you to inform your customers and prospects what they are consenting to. This means that you need to provide easy access to your privacy and data protection policies (something Google AdWords has required for ages).

Sharing your privacy and data protection policies easily and transparently can help you earn the trust and confidence of your web visitors. Not every visitor will read through them with a fine-tooth comb, but on a web littered with sketchy marketing practices, sharing your policies shows that you’re legit and that you have nothing to hide.

In the Unbounce landing page builder you can have any image, button or text link on your page open in a popup lightbox window. This means that you can link to the privacy policy already hosted on your website in a popup window on-click, and still keep visitors on your page to boost engagement and conversion rates.

This is a great example of how doing right by your customers can also help you achieve your business goals.

Here you can see a button being added to an Unbounce page, linking through to a privacy policy: something you’ll need to do going forward to be compliant.

The Right To Be Forgotten

At any point in time a customer or lead whose data you have collected can request that you erase any of their personal data you have stored. There are several grounds under which someone can make this request and the GDPR requires that you do so without “undue delay”.

As an Unbounce customer, simply submit an email request to our support team, who will ensure that all information for a specific lead or group of leads is deleted from our database.

As part of our ongoing commitment to supporting data privacy and security, we are exploring alternative solutions for deletion requests, but you can rest assured that even as of today, we will fulfill deletion requests within the time limit enforced by the GDPR.

Preventing Unauthorized Access to Data

Unbounce has supported SSL encryption on landing pages for years, and we’re proud that we made this a priority for our customers before Google started calling out non-https pages as not secure and giving preferential treatment to secure pages.

Unbounce customers can already adhere to the GDPR requirement to process all data securely.

When you build and publish your landing pages with Unbounce, you can force your web visitors to the secure (https) version of your pages, even if they accidentally navigate to the unsecure (http) version.

In the upper right corner you can toggle to force visitors to the secure HTTPS version of your page.

This forced redirect ensures your visitors’ lead data is properly encrypted in transit. And as an added bonus, it’ll keep you in Google’s good books and prevent ‘not secure’ warnings in Google Chrome.
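Conceptually, the toggle just maps any insecure URL to its secure counterpart. Here’s a sketch of the idea (not Unbounce’s actual implementation, which happens on their side):

```javascript
// Sketch of the HTTP-to-HTTPS redirect rule: rewrite the scheme and
// leave everything else (host, path, query) untouched.
function secureUrl(url) {
  return url.replace(/^http:\/\//i, 'https://');
}
```

A visitor who types http://try.example.com/offer ends up on https://try.example.com/offer, with the host and path untouched.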

Signing a Data Protection Addendum (DPA) With Unbounce

According to the GDPR, when you collect lead information with Unbounce, you are the data controller while Unbounce serves as your data processor. To comply with GDPR regulation when using a tool like a landing page builder or conversion platform, you need a signed DPA between you (the data controller) and the service provider (your data processor).

Without getting too deep into the weeds on this one, let me just say that if you’re using Unbounce, we’ve got you covered and that you can complete a form on our GDPR overview page to get your DPA by email.

Privacy = Trust = Great Marketing

At Unbounce we view data privacy and security as two cornerstones of great marketing. At their core they are about a positive user experience and can help make the internet a better place.

The GDPR puts more control in the hands of users to determine how their information is used. No one wants their personal data falling into the wrong hands or being used in malicious or intrusive ways. Confidence and trust in your brand is at stake when it comes to privacy, so we aren’t taking any chances. Using Unbounce as your conversion platform, you can assure your customers that you take their privacy and data security seriously.

Increased regulation around data privacy may provide short term challenges for marketers as we establish new norms, but long term they can provide a more positive experience for users — something we should always strive for as marketers.


How to Develop the Right Idea at the Right Time

How to Develop the Right Idea at the Right Time written by John Jantsch read more at Duct Tape Marketing

Marketing Podcast with Allen Gannett
Podcast Transcript

Allen Gannett

My guest for this week’s episode of the Duct Tape Marketing Podcast is Allen Gannett. He is the CEO and co-founder of TrackMaven, a marketing insights platform. He and I discuss his new book, The Creative Curve: How to Develop the Right Idea at the Right Time.

Gannett’s mission in life is to make people realize and live up to their potential. He believes “creativity” is accessible to all, most people just don’t have the right tools.

He has been on the “30 Under 30” lists for both Inc. and Forbes and is a contributor writing on the intersection of technology and human nature. Previously, he was a co-founder and General Partner of Acceleprise Ventures, the leading SaaS startup accelerator. He was also once a runner-up on Wheel of Fortune.

Questions I ask Allen Gannett:

  • What is the science behind creativity?
  • What are the four laws of the Creative Curve?
  • How has the research you’ve done impacted your own business?

What you’ll learn if you give a listen:

  • Why creativity is misunderstood
  • What you need in order to develop a skill
  • How Gannett’s research influenced his views on hiring

Key takeaways from the episode and more about Allen Gannett:

Like this show? Click on over and give us a review on iTunes, please!


Transcript of How to Develop the Right Idea at the Right Time



John Jantsch: Hello and welcome to another episode of the Duct Tape Marketing podcast. This is John Jantsch, and my guest today is Allen Gannett. He is the CEO and founder of TrackMaven, a marketing insights platform. He’s also the author of a book we’re going to talk about today called The Creative Curve: How to Develop the Right Idea at the Right Time. Allen, thanks for joining me.

Allen Gannett: Thanks for having me, man.

John Jantsch: A big premise of the book is to kind of debunk the creativity myth that you sit around and get this inspiration from a muse at some point in your life and that, in fact, there’s a science behind it. You want to tell me kind of your … it’s really the big idea of the book, I suppose, so you want to unpack that for us?

Allen Gannett: Creativity is one of those things that we talk about a lot in our culture. It’s on the cover of all these magazines. It’s this big topic in boardrooms. In Western culture, we have this notion of creativity as this magical, mystical thing that strikes a few certain people each generation, and there’s the Elon Musk and Steve Jobs of the world and the Mozarts and the JK Rowlings, but for the rest of us normies, we’re just sort of left out in the cold.

Allen Gannett: The thing that always bothered me is I’d always been someone who’d been a big reader of autobiographies and some of the literature around creativity. I run a marketing analytics company, so I spend a lot of time with marketers, and I didn’t realize the extent to which this had internalized with people. I thought people sort of knew that was the story but knew that, of course, that’s not actually how it works. I realized that, no, no, this is really how people believe creativity works, and so the book sort of came out of this frustration I had that I saw all these very smart people limiting their potential.

Allen Gannett: The book is split into two halves. The first half of the book I interviewed all of the living academics who study creativity, and I break down the myths around how creativity works using science and some of the real histories. I tell some of the real stories behind things like Paul McCartney’s creation of the song Yesterday, which has been over-hyped and over-sold for decades, and Mozart, which there was a whole bunch of, literally, things like forged letters and forged articles about Mozart that have become part of our common myths around Mozart.

Allen Gannett: In the second half of the book, I interviewed about 25 living creative geniuses. These are everyone from billionaires like David Rubenstein, Ted Sarandos, the chief content officer at Netflix, Nina Jacobson, the former president of Walt Disney Motion Pictures. She’s the producer of The Hunger Games. I interviewed even folks like Casey Neistat from YouTube and … really eclectic set of creative geniuses with the goal of saying, okay, if the science shows us that you can actually learn to become more creative, well then how have people actually done that? How have they accomplished that? The book is meant to both be a sort of myth-busting book but also actually be a practical guide to actually leveraging this yourself.

John Jantsch: I think there’s actually a lot of misunderstanding or misuse of the word creativity anyway.

Allen Gannett: Oh, totally.

John Jantsch: I do think that a lot of people that I run into, “Oh, I’m not creative,” which means, “I can’t paint like Picasso,” or something when, in fact, in my business, I’m not … If you set me down and say, “Make something,” I’m not a maker, but I could … I’ve built my entire career around taking other ideas and seeing how they fit together better, and I think that’s a creative science.

Allen Gannett: Oh, and totally, and this is one of the things that people … We have sort of a book cover mentality of creativity, I like to call it, where I wrote a book, there’s one name on the cover, but there’s so many people involved who are creative who make that happen. I mean there’s agents, editors, marketers, copy editors, proofreaders, research assistants, feedback readers, right? Every creative endeavor you see actually has a lot of different people involved, but we sort of have this book cover phenomenon, or I sometimes call it the front man phenomenon. In a band, we talk about the lead singer all the time even though there’s five people in the band. With creativity, we sort of talk about Steve Jobs and Elon Musk as if they’re these sort of Tony Stark-esque characters, and we forget the fact that Steve Jobs had Steve Wozniak. Elon Musk literally has the world’s best rocket scientists working for him.

Allen Gannett: The idea that these people are rolling these boulders up a hill by themselves is just not true, and so I think we’re surprisingly susceptible to these sort of PR person propagated narratives around creativity, because I also think, John, we kind of like it. We kind of like the idea that there’s something out there for all of us that’s going to be easy. When we talk about our passion, I think we’re slightly actually talking about, well, waiting for something to be easy, but nothing in life is easy.

Allen Gannett: You look at Mozart, and we talk about him as if he popped out of the womb playing the piano, but the reality is, when he was three years old, his dad, who’s basically a helicopter dad, was like, “You need to become a great musician.” Under the conditional love of his father, he started taking lessons with literally the best music teachers in all of Europe, and he practiced three hours seven days a week his entire childhood. This is not the story of it being easy for Mozart. This is the story of him doing the really hard part when he was young. I think we like this idea that, for some people, it’s easier, for some things it easy, because it kind of gives us an excuse.

John Jantsch: Well, and I also think that the narrative that is simple is a really useful device too because people can then share it, and they don’t have to … What you just went through, nobody wants to tell that story.

Allen Gannett: Of course, 100%. Everyone wants to believe it’s just straightforward.

John Jantsch: Yeah. I think you go as far as saying that just about anybody with the right motivation and the right process could practice and develop a skill, so let’s … Since I mentioned Picasso, could I paint if I had the right motivation?

Allen Gannett: Yes.

John Jantsch: I mean, right now, I will tell you I can’t.

Allen Gannett: Yes.

John Jantsch: I don’t think I could paint anything that anybody would see commercially interesting, but-

Allen Gannett: Totally.

John Jantsch: Right.

Allen Gannett: There’s two different parts of creativity. There’s the technical skill, and then there’s creating the right idea at the right time. On the technical skill side, we actually have now decades of research on talent development. What’s amazing, this is something I didn’t … I didn’t expect it to be this much of a consensus when I started writing the book, but the people, the researchers who spend their time studying talent development have come to the conclusion that, at best, natural-born talent is very rare and [wholefully 00:06:47] overblown, but more likely than not, the idea of natural-born talent actually doesn’t really exist.

Allen Gannett: It’s really that these people typically start very young. They have access to a lot of resources or maybe they were working on another skill, like the daughter who always played baseball in the backyard with her dad and then, by the time she was 12 and she went to her first-ever track practice, she was such a fast runner, and they’re like how did she learn this? It’s like, well, she was playing baseball in the backyard for seven years.

Allen Gannett: In the book, I actually profile the story … It’s actually one of the few stories we have of someone tracking their skill development over a long period of time. It’s the story of Jonathan Hardesty, who’s this painter who, at the age of 22, having never painted before, decided that he wanted to become a professional painter, and he proceeded to … For whatever reason, he was active on a online forum, and he created this forum thread which said that, “Every day, I’m going to post a picture of my painting. I’m going to paint every single day,” and for the next 13 years he did this, 13 years.

Allen Gannett: It’s a really amazing story being able to see he was such a terrible painter when he started. I got permission from him to use one of his first-ever sketches in the book and one of his sketches from much later, and it’s shocking. What he did is he followed, actually, all of the best practices that we have from research on talent and skill development on becoming a great painter, and now he teaches all these courses and classes on becoming a fine art painter and all this stuff, and his paintings sell for five figures, and so he’s a really great rare example of someone starting when they’re old. I think it’s hard because, when you’re older, you’re busy. You don’t have that much time, and there’s not a father or mother figure sort of bearing down on you, forcing you to get through the hard part.

John Jantsch: Well, and I do want to get to your four laws of the creative curve because I think that’s … obviously, that’s a big part of the book, but I think it’s also … I think people need to hear that process, but I want to start with something before that. One of the things that I have observed in my own life and in watching a lot of other people is that motivation has a tremendous amount to do with this.

John Jantsch: I’ll give you an example. I taught myself how to play the guitar when I was in junior high, and it wasn’t because I ever envisioned becoming a famous rock star. I saw it as a great … It turns out junior high girls love guitar players. That was a huge motivation for me to just take this thing on and do it myself. As silly as that example is, I think that that is probably the key to unlocking the whole thing. Isn’t it?

Allen Gannett: I mean this is one of the things that people sort of don’t realize. I think the reason why we see so many young people who seem to be very creative, it’s because their parents forced them. Right?

John Jantsch: Right, right.

Allen Gannett: That’s powerful [inaudible 00:09:37]. It’s Freudian. It’s developmental, whatever sort of psychological perspective you want to put on it, but over and over again we see that the idea of a stage parent is actually … plays a huge role in a lot of these young, creative lives. It’s a lot easier to be world-class by the time you’re 30 if you started when you were 3 than if you started when you were 25.

John Jantsch: Right, right, right. Yeah, I had to beg my parents to buy a used guitar, by the way. All right, so let’s talk about, then, the four laws because I do think that a lot of … there are definitely a lot of people, this is kind of ironic, a lot of people that are more left brain, and they need a process to be creative. I mean it makes total sense. You should pick up the bird, the book, I’m sorry, The Creative Curve.

Allen Gannett: And the bird.

John Jantsch: And the bird, to get really in-depth in this, but I’d like Allen to introduce his four laws.

Allen Gannett: Yeah. Basically, when we talk about creativity, there’s two types of creativity. There’s lower-case C creativity, and there’s upper-case C creativity. This is how academics differentiate them. Lower-case C creativity is just like creating something new. Upper-case C creativity is what most of us actually want to do, which is creating something that’s both new and valuable. Value is a subjective assessment, right? Creating something that society deems to be valuable, well, people have to see it. They have to experience it. They have to deem it valuable, so there’s a bit of a circular phenomenon that happens.

Allen Gannett: The back half of the book deals with this sort of upper-case C creativity. How do you actually get this? How do you actually develop the right idea at the right time? It turns out that we actually have a lot of really good science about what drives human preference. I explained it a lot more in detail in the book, but the short version is that we like ideas that are a blend of the familiar and the novel. They’re not too unfamiliar to be scary, because we’re biologically wired to fear the unfamiliar because we worry it might kill us, like if we went to a cave as a caveman that we’d never been in before versus a cave we’ve been in many times, but then we also … turns out we like things that are novel because they represent potential sources of reward. You can think about when we were hunter-gatherers why this was important.

Allen Gannett: These two seemingly contradictory ideas, our fear of the unfamiliar and our pursuit of the novel, lead to this really elegant relationship where we like ideas that are a blend of the familiar and the novel. The first Star Wars, for example, was a Western in space. Right now, every city has a bunch of these sushi burrito places popping up. They’re just giant sushi rolls. They’re familiar but they’re novel. You see that this is a huge driver of human behavior, and so the four laws really explain how do you nail this timing?

Allen Gannett: The first law that I talk about is consumption. We talk about how creatives are always doing. They’re very active. There’s that annoying social media meme you might have seen, which is like, “90% of people consume, 9% engage, 1% create. #HUSTLE.” It’s not only stupid, but it’s also wrong because it actually turns out that, since familiarity is such an important part of the creative process, consumption, so you know what’s already out there, is actually a huge part of it, and so I talk about why and how.

Allen Gannett: Ted Sarandos, the chief content officer of Netflix, told me this wonderful story about how he started his career as a video store clerk who watched every single movie in the store. JK Rowling, when she was a kid, would close her bedroom door and just read book after book after book after book. The second-

John Jantsch: Right. I think the piece that maybe people are tripping up on is what I just heard you describe. It was intentional consumption.

Allen Gannett: Exactly, so it’s actually … What’s really interesting-

John Jantsch: It’s not just like, “Oh, I’m going to go on Facebook and see all the blah, blah, blah.” There’s intent in what you’re doing.

Allen Gannett: Yes, and it’s not just how much they consume, but it’s … exactly. It’s how they consume, and that goes into the second law, which is imitation. How these great creatives actually consume is in this way that’s very interactive. The best way you could summarize it is they’re imitating it.

Allen Gannett: I tell the story in the book about Ben Franklin and how we think of him as this great writer but, at the age of 18, he viewed himself as a terrible writer, probably because his dad told him so, again, this parent thing. He decided that he was going to start imitating some of the structures of articles he loved in a magazine called The Spectator. What you see is this sort of Mad Libification by these creative geniuses of other creative works where, instead of just reading a novel, they’ll outline, well, how is it structured? What’s the story arc?

Allen Gannett: Kurt Vonnegut, for his master’s thesis, literally created these charts showing the different story arcs of great novels, and this was one of the foundational things for him as a storyteller. You see that it’s not just that these great creatives consume a lot, and they do, but they also do it in a way which is much more interactive than we typically do and much more focused on imitation. That’s this-

John Jantsch: Yeah. Actually, a process that I’ve used for years in writing my books … I wrote a book called The Referral Engine, and so I’m looking for ideas on building community, and referrals, and different word-of-mouth things. I’ll read books that are unrelated to business, on math, on architecture. It’s amazing. When you go into it with that filter, I’m looking for ideas that I could apply to community building and referrals, and it’s amazing how the book is a whole different book in that [crosstalk 00:15:09]-

Allen Gannett: Oh, 100%. I mean I obviously … If you ever want to feel a lot of pressure, write a book on creating hits.

John Jantsch: Yeah, right.

Allen Gannett: It’s a lot of pressure, or write a book on creativity, and it has all this meta stuff to it. I mean, for me, it was like one of the things I, as a first-time author, was struggling with was the best way to go to switch between chapters. It’s just something I didn’t have a natural knack for, and so I went … ended up, as I was writing the book, using a lot of the methods in the book, and so going and seeing some of the different ways that other people did it. That helped give me the framework for realizing, okay, what are the different was I can do it? What do I like? What do I not like? How can I repurpose this in a way that fits my voice and my style versus, if I just kept sitting there looking at it and hoping an idea would hit me, I’d still be here, right, thinking how to end my chapters.

John Jantsch: All right, so I think we’re up to number three, creative [crosstalk 00:15:57]-

Allen Gannett: Okay, number three. Yeah, so number three I talk about in the book is that we think of these creative geniuses as these solo actors, Steve Jobs, Elon Musk, Oprah, but reality is, since there’s this social construct element to creativity, since it’s about what is valuable, you actually have to have a lot of different people involved, and I describe the different roles that you have to have in your creative communities, and there’s four that I talk about in the book.

Allen Gannett: Then the fourth and final law is all about data-driven iterations. I think we have this notion of the novelist who goes into the woods and writes their book in a writing cabin and, only once they write the end, period, do they come out. The reality is that, since these … The creatives who are the best at it realize that there’s this whole social construct element, that the relationship with their audience is so important that they are actually very focused on, early and often, getting feedback and then using that to iterate over and over again.

Allen Gannett: I talk about, in the book, everything from the movie industry to romance writers to … One of my favorite stories is I spent a day with the flavor team at Ben & Jerry’s who creates new flavors. That process, which is a culinary process, is shockingly data-driven. They literally do surveys and all this fascinating stuff. It’s not super expensive what they’re doing, they use a lot of email surveys, but it is data-driven.

Allen Gannett: I think that’s one of the big mistakes that aspiring creators have is that, oftentimes, aspiring creators are creating for themselves, and they’re not creating for their audience. The best creators are creating for their audience. Since they know that, they are much more likely to actually listen to their audience.

John Jantsch: Well, and it’s interesting. Over the last decade, I think that the adoption of blogging, wherever that is today, 10 years ago, I think some … there were a heck of a lot of authors that were iterating every day-

Allen Gannett: Completely.

John Jantsch: … because they were writing content that eventually made it into a book. I know I’ve done that numerous times, and I’ve seen a lot of other people that their blogs kind of blew up into books because of comments, and feedback, and the ability to say, “Oh, that resonated. I should go deeper there.” I think there are plenty of examples of a lot of books that became big hits started out as daily blogs.

Allen Gannett: Oh, 100%, and you see this, and they become … I mean Gary Vaynerchuk’s done a great job of this, right, just sort of getting community feedback, Tim Ferriss, obviously. You see this a lot of times. You’ll see these guys, they’ll … Even journalists will write an article for The New Yorker. It does really well. It goes viral. Then they’ll sell the book, and then they’ll sort of work through that.

Allen Gannett: The reality is that the best creative processes are messy, and gross, and involve lots of shades of gray, and all this stuff. I think we have this romantic notion. JK Rowling’s a great example. I mean the story about JK Rowling is she was on a train. She had the idea for Harry Potter. She started writing it on a napkin. First of all, she didn’t have a napkin. She didn’t have a pen. She was on a train. She had the idea for the character Harry Potter and some of his sidekicks, but then it took her five years to write the first book, five years. In one interview, she actually showed the interviewer the box of all 15 different versions of Chapter One she had written because she couldn’t figure out how she wanted to start the book, 15 different versions. This is not the story of her waking up one day with a multi-billion-dollar idea.

John Jantsch: No. Yeah, and then the process of selling that book was just as messy.

Allen Gannett: Yeah, totally. I interviewed, for the book, her first agent and her first publisher. I mean, that book, there was thought behind how to roll it out to the market. They were very mindful of how to do it.

John Jantsch: Yeah. Well, and the rest is history, of course, but you’re right. I mean I do think that we have a tendency in our culture, the social media, YouTube culture, to really kind of hold those ideas out there and think of the billions of other successes that we’ve never heard of that probably went through the same process. I mean they were successful in a different way at a different level, but we obviously all look at all of the stories that hit the one or two kind of social media viral hits.

Allen Gannett: Totally.

John Jantsch: Tell me a little bit about how this research that you’ve done has shaped or evolved your own business TrackMaven.

Allen Gannett: Oh, I mean it’s super interesting. One, it’s affected how I coach people. I think I always had confidence that people were generally underselling themselves when it came to their own talents and development, but writing this book, which took me even further on the side that natural-born talent doesn’t really exist, has made me, I think, a much more practical but also much more aggressive coach to my team where I think I really push people hard to get rid of those things they’ve put on themselves. I mean there’s these famous studies that were done in the ’90s where 86% of kindergartners tested at creative genius levels of creative potential, but I think it was like 16% of high school seniors, something in the teens.

Allen Gannett: Yeah, and it’s like … and you totally see this. There’s this entire social set of constructs we’ve put in ourselves, the social conditioning where we believe that we were meant to be X, and we can’t be Y, and it’s so, so, so, so, so much not real. It’s just in our heads. It’s what we’ve been told. It’s the result of middle-class parents telling kids to get their safe job, to be professional, whatever it is. I think it’s really dangerous, and so, for me as a manager and as a leader, I think I have become much more aggressive at trying to coach people out of that.

John Jantsch: Yeah. I think that times have changed a bit, but a lot of high school kids, the creatives were the nerds. You know?

Allen Gannett: Yeah.

John Jantsch: Of course, now they’re running the world, but I think that actually … Somebody who was really … peer pressure stopped them from pursuing kind of an interest because of that. I think that’s the real shame-

Allen Gannett: Exactly.

John Jantsch: … in not kind of bringing this out as, hey, this is the cool kids or whatever we want to call it now, so it’s interesting, as I heard you talk about that, I wonder what the implications are just for hiring in general.

Allen Gannett: I think I tend to very much focus hiring around potential. I tend not to be … and this is obviously as a young CEO. I think, also, you just tend to be a little more experience skeptical because you also see the downsides of experience around people having their own cognitive biases around previous experience and, “This worked before, so I’m going to do that again.” I tend to think I’m much more potential-oriented. The result is we have a lot of managers who are sort of battlefield promotions, so to speak, where they’ve grown up in the organization, and I think that makes them … They know a lot of the context. They’re more loyal, all that sort of stuff. I think that’s probably the biggest change for me as a leader is just really, yeah, being willing to take more risks on who I hire.

John Jantsch: Yeah. I mean I think we need creativity out of every position, so I guess if you make that a part of the process where you’re going to, as you said, coach and teach a process of creativity or at least to bring out the creativity in everybody, then there isn’t any reason to necessarily just say, “Oh, you have a creative background.”

Allen Gannett: Exactly.

John Jantsch: Allen, tell people where they can get the book and find out more about TrackMaven and everything else you’re up to.

Allen Gannett: You can check out the book at and anywhere books are sold. Check out and for more on me.

John Jantsch: All right. Thanks, Allen. Hopefully, we’ll run into you out there in the world someday.

Allen Gannett: Bye.

Duct Tape Marketing

WTF is dark pattern design?

If you’re a UX designer you won’t need this article to tell you about dark pattern design. But perhaps you chose to tap here out of a desire to reaffirm what you already know — to feel good about your professional expertise.

Or was it that your conscience pricked you? Go on, you can be honest… or, well, can you?

A third possibility: Perhaps an app you were using presented this article in a way that persuaded you to tap on it rather than on some other piece of digital content. And it’s those sorts of little imperceptible nudges — what to notice, where to tap/click — that we’re talking about when we talk about dark pattern design.

But not just that. The darkness comes into play because UX design choices are being made to intentionally deceive. To nudge the user to give up more than they realize. Or to agree to things they probably wouldn’t if they genuinely understood the decisions they were being pushed to make.

To put it plainly, dark pattern design is deception and dishonesty by design… Still sitting comfortably?

The technique, as it’s deployed online today, often feeds off and exploits the fact that content-overloaded consumers skim-read stuff they’re presented with, especially if it looks dull and they’re in the midst of trying to do something else — like sign up to a service, complete a purchase, get to something they actually want to look at, or find out what their friends have sent them.

Manipulative timing is a key element of dark pattern design. In other words, when you see a notification can determine how you respond to it. Or whether you even notice it. Interruptions generally pile on the cognitive overload — and deceptive design deploys them to make it harder for a web user to be fully in control of their faculties during a key moment of decision.

Dark patterns used to obtain consent to collect users’ personal data often combine unwelcome interruption with a built-in escape route — offering an easy way to get rid of the dull-looking menu getting in the way of what you’re actually trying to do.

Brightly colored ‘agree and continue’ buttons are a recurring feature of this flavor of dark pattern design. These eye-catching signposts appear near universally across consent flows — to encourage users not to read or contemplate a service’s terms and conditions, and therefore not to understand what they’re agreeing to.

It’s ‘consent’ by the spotlit backdoor.

This works because humans are lazy in the face of boring and/or complex looking stuff. And because too much information easily overwhelms. Most people will take the path of least resistance. Especially if it’s being reassuringly plated up for them in handy, push-button form.

At the same time dark pattern design will ensure the opt out — if there is one — will be near invisible: greyscale text on a grey background is the usual choice.
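How invisible is greyscale-on-grey in practice? The WCAG contrast-ratio formula puts a number on it. The sketch below compares a hypothetical grey opt-out link against a hypothetical white-on-blue ‘agree’ button; the hex values are invented for illustration, not measured from any particular site.

```python
def _linear(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light (WCAG 2.x formula)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(hex_color: str) -> float:
    """Relative luminance of a '#rrggbb' color."""
    h = hex_color.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linear(r) + 0.7152 * _linear(g) + 0.0722 * _linear(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Hypothetical grey opt-out text on a grey dialog background:
grey_on_grey = contrast_ratio("#999999", "#888888")   # ~1.24:1

# Hypothetical white label on a bright blue 'agree' button:
white_on_blue = contrast_ratio("#ffffff", "#1877f2")  # ~4.2:1
```

WCAG recommends at least 4.5:1 for normal body text. A roughly 1.24:1 opt out is, for a skimming eye, functionally invisible — which is the point.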

Some deceptive designs even include a call to action displayed on the colorful button they do want you to press — with text that says something like ‘Okay, looks great!’ — to further push a decision.

Likewise, the less visible opt out option might use a negative suggestion to imply you’re going to miss out on something or are risking bad stuff happening by clicking here.

The horrible truth is that deceptive designs can be awfully easy to paint.

Where T&Cs are concerned, it really is shooting fish in a barrel. Because humans hate being bored or confused and there are countless ways to make decisions look off-puttingly boring or complex — be it presenting reams of impenetrable legalese in tiny greyscale lettering so no-one will bother reading it combined with defaults set to opt in when people click ‘ok’; deploying intentionally confusing phrasing and/or confusing button/toggle design that makes it impossible for the user to be sure what’s on and what’s off (and thus what’s opt out and what’s an opt in) or even whether opting out might actually mean opting into something you really don’t want…

Friction is another key tool of this dark art: For example designs that require lots more clicks/taps and interactions if you want to opt out. Such as toggles for every single data share transaction — potentially running to hundreds of individual controls a user has to tap on vs just a few taps or even a single button to agree to everything. The weighting is intentionally all one way. And it’s not in the consumer’s favor.

Deceptive designs can also make it appear that opting out is not even possible. Such as default opting users in to sharing their data and, if they try to find a way to opt out, requiring they locate a hard-to-spot alternative click — and then also requiring they scroll to the bottom of lengthy T&Cs to unearth a buried toggle where they can in fact opt out.

Facebook used that technique to carry out a major data heist by linking WhatsApp users’ accounts with Facebook accounts in 2016. Despite prior claims that such a privacy u-turn could never happen. The vast majority of WhatsApp users likely never realized they could say no — let alone understood the privacy implications of consenting to their accounts being linked.

Ecommerce sites also sometimes suggestively present an optional (priced) add-on in a way that makes it appear like an obligatory part of the transaction. Such as using a brightly colored ‘continue’ button during a flight checkout process that also automatically bundles an optional extra like insurance, instead of plainly asking people if they want to buy it.

Or using pre-selected checkboxes to sneak low cost items or a small charity donation into a basket when a user is busy going through the check out flow — meaning many customers won’t notice it until after the purchase has been made.

Airlines have also been caught using deceptive design to upsell pricier options, such as by obscuring cheaper flights and/or masking prices so it’s harder to figure out what the most cost effective choice actually is.

Dark patterns to thwart attempts to unsubscribe are horribly, horribly common in email marketing. Such as an unsubscribe UX that requires you to click a ridiculous number of times and keep reaffirming that yes, you really do want out.

Often these additional screens are deceptively designed to resemble the ‘unsubscribe successful’ screens that people expect to see when they’ve pulled the marketing hooks out. But if you look very closely, at the typically very tiny lettering, you’ll see they’re actually still asking if you want to unsubscribe. The trick is to get you not to unsubscribe by making you think you already have.

Another oft-used deceptive design that aims to manipulate online consent flows works against users by presenting a few selectively biased examples — which gives the illusion of helpful context around a decision. But actually this is a turbocharged attempt to manipulate the user by presenting a self-servingly skewed view that is in no way a full and balanced picture of the consequences of consent.

At best it’s disingenuous. More plainly it’s deceptive and dishonest.

Here’s just one example of selectively biased examples presented during a Facebook consent flow used to encourage European users to switch on its face recognition technology. Clicking ‘continue’ leads the user to the decision screen — but only after they’ve been shown this biased interstitial…

Facebook is also using emotional manipulation here, in the wording of its selective examples, by playing on people’s fears (claiming its tech will “help protect you from a stranger”) and playing on people’s sense of goodwill (claiming your consent will be helpful to people with visual impairment) — to try to squeeze agreement by making people feel fear or guilt.

You wouldn’t like this kind of emotionally manipulative behavior if a human was doing it to you. But Facebook frequently tries to manipulate its users’ feelings to get them to behave how it wants.

For instance to push users to post more content — such as by generating an artificial slideshow of “memories” from your profile and a friend’s profile, and then suggesting you share this unasked for content on your timeline (pushing you to do so because, well, what’s your friend going to think if you choose not to share it?). Of course this serves its business interests because more content posted to Facebook generates more engagement and thus more ad views.

Or — in a last ditch attempt to prevent a person from deleting their account — Facebook has been known to use the names and photos of their Facebook friends to claim such and such a person will “miss you” if you leave the service. So it’s suddenly conflating leaving Facebook with abandoning your friends.

Distraction is another deceptive design technique deployed to sneak more from the user than they realize. For example cutesy looking cartoons that are served up to make you feel warm and fluffy about a brand — such as when they’re periodically asking you to review your privacy settings.

Again, Facebook uses this technique. The cartoony look and feel around its privacy review process is designed to make you feel reassured about giving the company more of your data.

You could even argue that Google’s entire brand is a dark pattern design: Childishly colored and sounding, it suggests something safe and fun. Playful even. The feelings it generates — and thus the work it’s doing — bear no relation to the business the company is actually in: Surveillance and people tracking to persuade you to buy things.

Another example of dark pattern design: Notifications that pop up just as you’re contemplating purchasing a flight or hotel room, say, or looking at a pair of shoes — which urge you to “hurry!” as there’s only X number of seats or pairs left.

This plays on people’s FOMO, trying to rush a transaction by making a potential customer feel like they don’t have time to think about it or do more research — and thus thwart the more rational and informed decision they might otherwise have made.

The kicker is there’s no way to know if there really were just two seats left at that price. Much like the ghost cars Uber was caught displaying in its app — which it claimed were for illustrative purposes, rather than being exactly accurate depictions of cars available to hail — web users are left having to trust what they’re being told is genuinely true.

But why should you trust companies that are intentionally trying to mislead you?

Dark patterns point to an ethical vacuum

The phrase dark pattern design is pretty antique in Internet terms, though you’ll likely have heard it being bandied around quite a bit of late. Wikipedia credits UX designer Harry Brignull with the coinage, back in 2010, when he registered a website to chronicle and call out the practice as unethical.

“Dark patterns tend to perform very well in A/B and multivariate tests simply because a design that tricks users into doing something is likely to achieve more conversions than one that allows users to make an informed decision,” wrote Brignull in 2011 — highlighting exactly why web designers were skewing towards being so tricksy: Superficially it works. The anger and mistrust come later.
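Brignull’s point about A/B tests is easy to demonstrate. A conversion metric counts clicks, not informed consent, so a variant that, say, pre-ticks the consent box can ‘win’ a standard two-proportion significance test decisively. A minimal sketch, using entirely hypothetical numbers:

```python
import math

def ab_z_score(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test, a standard significance check for A/B experiments."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference:
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical experiment: variant A pre-ticks the consent box,
# variant B asks the user plainly. 1,000 users see each variant.
z = ab_z_score(400, 1000, 250, 1000)  # ~7.16, far beyond the usual 1.96 cutoff
```

The test says variant A converts better, with overwhelming statistical confidence. It says nothing about whether those 400 users understood what they agreed to — which is exactly the gap a dark pattern exploits.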

Close to a decade later, Brignull’s website is still valiantly calling out deceptive design. So perhaps he should rename this page ‘the hall of eternal shame’. (And yes, before you point it out, you can indeed find brands owned by TechCrunch’s parent entity Oath among those being called out for dark pattern design… It’s fair to say that dark pattern consent flows are shamefully widespread among media entities, many of which aim to monetize free content with data-thirsty ad targeting.)

Of course the underlying concept of deceptive design has roots that run right through human history. See, for example, the original Trojan horse. (A sort of ‘reverse’ dark pattern design — given the Greeks built an intentionally eye-catching spectacle to pique the Trojans’ curiosity, getting them to lower their guard and take it into the walled city, allowing the fatal trap to be sprung.)

Basically, the more tools that humans have built, the more possibilities they’ve found for pulling the wool over other people’s eyes. The Internet just kind of supercharges the practice and amplifies the associated ethical concerns because deception can be carried out remotely and at vast, vast scale. Here the people lying to you don’t even have to risk a twinge of personal guilt because they don’t have to look into your eyes while they’re doing it.

Nowadays falling foul of dark pattern design most often means you’ll have unwittingly agreed to your personal data being harvested and shared with a very large number of data brokers who profit from background trading people’s information — without making it clear they’re doing so nor what exactly they’re doing to turn your data into their gold. So, yes, you are paying for free consumer services with your privacy.

Another aspect of dark pattern design has been bent towards encouraging Internet users to form addictive habits attached to apps and services. Often these kind of addiction forming dark patterns are less visually obvious on a screen — unless you start counting the number of notifications you’re being plied with, or the emotional blackmail triggers you’re feeling to send a message for a ‘friendversary’, or not miss your turn in a ‘streak game’.

This is the Nir Eyal ‘hooked’ school of product design. Which has actually run into a bit of a backlash of late, with big tech now competing — at least superficially — to offer so-called ‘digital well-being’ tools to let users unhook. Yet these are tools the platforms are still very much in control of. So there’s no chance you’re going to be encouraged to abandon their service altogether.

Dark pattern design can also cost you money directly. For example if you get tricked into signing up for or continuing a subscription you didn’t really want. Though such blatantly egregious subscription deceptions are harder to get away with. Because consumers soon notice they’re getting stung for $50 a month they never intended to spend.

That’s not to say ecommerce is clean of deceptive crimes now. The dark patterns have generally just got a bit more subtle. Pushing you to transact faster than you might otherwise, say, or upselling stuff you don’t really need.

Although consumers will usually realize they’ve been sold something they didn’t want or need eventually. Which is why deceptive design isn’t a sustainable business strategy, even setting aside ethical concerns.

In short, it’s short term thinking at the expense of reputation and brand loyalty. Especially as consumers now have plenty of online platforms where they can vent and denounce brands that have tricked them. So trick your customers at your peril.

That said, it takes longer for people to realize their privacy is being sold down the river. If they even realize at all. Which is why dark pattern design has become such a core enabling tool for the vast, non-consumer facing ad tech and data brokering industry that’s grown fat by quietly sucking on people’s data.

Think of it as a bloated vampire octopus wrapped invisibly around the consumer web, using its myriad tentacles and suckers to continuously manipulate decisions and close down user agency in order to keep data flowing — with all the A/B testing techniques and gamification tools it needs to win.

“It’s become substantially worse,” agrees Brignull, discussing the practice he began critically chronicling almost a decade ago. “Tech companies are constantly in the international news for unethical behavior. This wasn’t the case 5-6 years ago. Their use of dark patterns is the tip of the iceberg. Unethical UI is a tiny thing compared to unethical business strategy.”

“UX design can be described as the way a business chooses to behave towards its customers,” he adds, saying that deceptive web design is therefore merely symptomatic of a deeper Internet malaise.

He argues the underlying issue is really about “ethical behavior in US society in general”.

The deceitful obfuscation of commercial intention certainly runs all the way through the data brokering and ad tech industries that sit behind much of the ‘free’ consumer Internet. Here consumers have plainly been kept in the dark so they cannot see and object to how their personal information is being handed around, sliced and diced, and used to try to manipulate them.

From an ad tech perspective, the concern is that manipulation doesn’t work when it’s obvious. And the goal of targeted advertising is to manipulate people’s decisions based on intelligence about them gleaned via clandestine surveillance of their online activity (so inferring who they are via their data). This might be a purchase decision. Equally it might be a vote.

The stakes have been raised considerably now that data mining and behavioral profiling are being used at scale to try to influence democratic processes.

So it’s not surprising that Facebook is so coy about explaining why a certain user on its platform is seeing a specific advert. Because if the huge surveillance operation underpinning the algorithmic decision to serve a particular ad was made clear, the person seeing it might feel manipulated. And then they would probably be less inclined to look favorably upon the brand they were being urged to buy. Or the political opinion they were being pushed to form. And Facebook’s ad tech business stands to suffer.

The dark pattern design that’s trying to nudge you to hand over your personal information is, as Brignull says, just the tip of a vast and shadowy industry that trades on deception and manipulation by design — because it relies on the lie that people don’t care about their privacy.

But people clearly do care about privacy. Just look at the lengths to which ad tech entities go to obfuscate and deceive consumers about how their data is being collected and used. If people don’t mind companies spying on them, why not just tell them plainly it’s happening?

And if people were really cool about sharing their personal and private information with anyone, and totally fine about being tracked everywhere they go and having a record kept of all the people they know and have relationships with, why would the ad tech industry need to spy on them in the first place? They could just ask up front for all your passwords.

The deception enabled by dark pattern design not only erodes privacy, putting web users under pervasive, clandestine surveillance with all the chilling effects that entails; it also risks enabling damaging discrimination at scale. Because non-transparent decisions made off the back of inferences gleaned from data taken without people’s consent can mean that, for example, only certain types of people are shown certain types of offers and prices, while others are not.

Facebook was forced to make changes to its ad platform after it was shown that an ad-targeting category called ‘ethnic affinity’ — aka Facebook users whose online activity indicates an interest in “content relating to particular ethnic communities” — could be used to run housing and employment ads that discriminate against protected groups.

More recently, the major political ad scandals relating to Kremlin-backed disinformation campaigns targeting the US and other countries via Facebook’s platform, and the massive Facebook user data heist in which the controversial political consultancy Cambridge Analytica deployed quiz apps to improperly suck out people’s data in order to build psychographic profiles for political ad targeting, have shone a spotlight on the risks that flow from platforms that operate by systematically keeping their users in the dark.

As a result of these scandals, Facebook has started offering a level of disclosure around who is paying for and running some of the ads on its platform. But plenty of aspects of its platform and operations remain shrouded. Even those components that are being opened up a bit are still obscured from view of the majority of users — thanks to the company’s continued use of dark patterns to manipulate people into acceptance without actual understanding.

And yet while dark pattern design has been the slickly successful oil in the engines of the ad tech industry for years, allowing it to get away with so much consent-less background data processing, some of the shadier practices of this sector are gradually being illuminated and shut down — including as a consequence of shoddy security practices: with so many companies involved in trading and mining people’s data, there are simply more opportunities for it to leak.

Laws around privacy are also being tightened. And changes to EU data protection rules are a key reason why dark pattern design has bubbled back up into online conversations lately. The practice is under far greater legal threat now as GDPR tightens the rules around consent.

This week a study by the Norwegian Consumer Council criticized Facebook and Google for systematically deploying design choices that nudge people towards making decisions which negatively affect their own privacy — such as data sharing defaults, and friction injected into the process of opting out so that fewer people will do so.

Another manipulative design decision flagged by the report is especially illustrative of the deceptive levels to which companies will stoop to get users to do what they want — with the watchdog pointing out how Facebook paints fake red dots onto its UI in the midst of consent decision flows in order to encourage the user to think they have a message or a notification. Thereby rushing people to agree without reading any small print.

Fair and ethical design is design that requires people to opt in affirmatively to any actions that benefit the commercial service at the expense of the user’s interests. Yet all too often it’s the other way around: Web users have to go through sweating toil and effort to try to safeguard their information or avoid being stung for something they don’t want.

You might think the types of personal data that Facebook harvests are trivial — and so wonder what’s the big deal if the company is using deceptive design to obtain people’s consent? But the purposes to which people’s information can be put are not at all trivial — as the Cambridge Analytica scandal illustrates.

One of Facebook’s recent data grabs in Europe also underlines how it’s using dark patterns on its platform to attempt to normalize increasingly privacy hostile technologies.

Earlier this year it began asking Europeans for consent to processing their selfies for facial recognition purposes — a highly controversial technology that regulatory intervention in the region had previously blocked. Yet now, as a consequence of Facebook’s confidence in crafting manipulative consent flows, it’s essentially figured out a way to circumvent EU citizens’ fundamental rights — by socially engineering Europeans to override their own best interests.

Nor is this type of manipulation exclusively meted out to certain, more tightly regulated geographies; Facebook is treating all its users like this. European users just received its latest set of dark pattern designs first, ahead of a global rollout, thanks to the bloc’s new data protection regulation coming into force on May 25.

CEO Mark Zuckerberg even went so far as to gloat about the success of this deceptive modus operandi on stage at a European conference in May — claiming the “vast majority” of users were “willingly” opting in to targeted advertising via its new consent flow.

In truth the consent flow is manipulative, and Facebook does not even offer an absolute opt out of targeted advertising on its platform. The ‘choice’ it gives users is to agree to its targeted advertising or to delete their account and leave the service entirely. Which isn’t really a choice when balanced against the power of Facebook’s platform and the network effect it exploits to keep people using its service.

‘Forced consent’ is an early target for privacy campaign groups making use of GDPR opening the door, in certain EU member states, to collective enforcement of individuals’ data rights.

Of course if you read Facebook or Google’s PR around privacy they claim to care immensely — saying they give people all the controls they need to manage and control access to their information. But controls with dishonest instructions on how to use them aren’t really controls at all. And opt outs that don’t exist smell rather more like a lock in. 

Platforms certainly remain firmly in the driving seat because — until a court tells them otherwise — they control not just the buttons and levers but the positions, sizes, colors, and ultimately the presence or otherwise of the buttons and levers.

And because these big tech ad giants have grown so dominant as services, they are able to wield huge power over their users — even tracking non-users over large swathes of the rest of the Internet, and giving them even fewer controls than the people who are de facto locked in (even if, technically speaking, service users might be able to delete an account or abandon a staple of the consumer web).

Big tech platforms can also leverage their size to analyze user behavior at vast scale and A/B test the dark pattern designs that trick people the best. So the notion that users have been willingly agreeing en masse to give up their privacy remains the big lie squatting atop the consumer Internet.
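
At vast scale, this kind of optimization is mechanical. As a purely hypothetical illustration (all variant names and opt-in rates below are invented), here is how an A/B test can surface the consent-flow design that extracts the most opt-ins — with no signal at all about whether users understood what they agreed to:

```python
import random

random.seed(42)

# Assumed opt-in probabilities per design variant (invented numbers):
#   "plain"         -- a neutral, clearly worded choice
#   "red_dot"       -- a fake notification badge rushing the user
#   "buried_optout" -- the decline option hidden behind extra clicks
VARIANTS = {"plain": 0.35, "red_dot": 0.55, "buried_optout": 0.70}

def run_ab_test(users_per_variant=10_000):
    """Simulate showing each variant to a cohort and measure opt-in rates."""
    results = {}
    for name, p in VARIANTS.items():
        opt_ins = sum(random.random() < p for _ in range(users_per_variant))
        results[name] = opt_ins / users_per_variant
    return results

rates = run_ab_test()
# The "winning" design is simply whichever tricked the most people.
winner = max(rates, key=rates.get)
```

Nothing in this loop measures comprehension or satisfaction — only conversion. That is the whole problem with optimizing consent flows like ad creative.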

People are merely choosing the choice that’s being pre-selected for them.

That’s where things stand as is. But the future is looking increasingly murky for dark pattern design.

Change is in the air.

What’s changed is there are attempts to legally challenge digital disingenuousness, especially around privacy and consent. This after multiple scandals have highlighted some very shady practices being enabled by consent-less data-mining — making both the risks and the erosion of users’ rights clear.

Europe’s GDPR has tightened requirements around consent — and is creating the possibility of redress, with penalties large enough to make enforcement meaningful. It has already caused some data-dealing businesses to pull the plug entirely or exit Europe.

New laws with teeth make legal challenges viable, which was simply not the case before. Though major industry-wide change will take time, as it will require waiting for judges and courts to rule.

“It’s a very good thing,” says Brignull of GDPR. Though he’s not yet ready to call it the death blow that deceptive design really needs, cautioning: “We’ll have to wait to see whether the bite is as strong as the bark.”

In the meanwhile, every data protection scandal ramps up public awareness about how privacy is being manhandled and abused, and the risks that flow from that — both to individuals (e.g. identity fraud) and to societies as a whole (be it election interference or more broadly attempts to foment harmful social division).

So while dark pattern design is essentially ubiquitous across the consumer web of today, the deceptive practices it has been used to shield and enable are on borrowed time. The direction of travel — and the direction of innovation — is pro-privacy, pro-user control and therefore anti-deceptive-design. Even if the most embedded practitioners are far too vested to abandon their dark arts without a fight.

What, then, does the future look like? What is ‘light pattern design’? The way forward — at least where privacy and consent are concerned — must be user centric. This means genuinely asking for permission — using honesty to win trust by enabling rather than disabling user agency.

Designs must champion usability and clarity, presenting a genuine, good faith choice. Which means no privacy-hostile defaults: So opt ins, not opt outs, and consent that is freely given because it’s based on genuine information not self-serving deception, and because it can also always be revoked at will.
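
Those principles translate directly into how consent is modeled in software. A minimal sketch (class and field names are invented for illustration, not drawn from any real platform’s code): every purpose defaults to opted out, consent is recorded only on an affirmative action, and it can always be revoked:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Privacy-respecting defaults: everything starts opted OUT."""
    targeted_ads: bool = False   # opt in, never pre-checked
    data_sharing: bool = False
    analytics: bool = False
    granted_at: dict = field(default_factory=dict)

    def grant(self, purpose: str) -> None:
        """Record an affirmative, freely given opt-in, with a timestamp."""
        setattr(self, purpose, True)
        self.granted_at[purpose] = datetime.now(timezone.utc)

    def revoke(self, purpose: str) -> None:
        """Consent is always revocable at will."""
        setattr(self, purpose, False)
        self.granted_at.pop(purpose, None)

c = ConsentRecord()        # all purposes start opted out
c.grant("targeted_ads")    # only an explicit action turns one on
c.revoke("targeted_ads")   # and it can be withdrawn just as easily
```

The design choice is the point: a pre-checked box or a buried opt-out cannot exist in this model, because consent only ever flows from an explicit `grant` call.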

Design must also be empathetic. It must understand and be sensitive to diversity — offering clear options without being intentionally overwhelming. The goal is to close the perception gap between what’s being offered and what the customer thinks they’re getting.

Those who want to see a shift towards light patterns and plain dealing also point out that online transactions honestly achieved will be happier and healthier for all concerned — because they will reflect what people actually want. So rather than grabbing short term gains deceptively, companies will be laying the groundwork for brand loyalty and organic and sustainable growth.

The alternative to the light pattern path is also clear: Rising mistrust, rising anger, more scandals, and — ultimately — consumers abandoning brands and services that creep them out and make them feel used. Because no one likes feeling exploited. And even if people don’t delete an account entirely they will likely modify how they interact, sharing less, being less trusting, less engaged, seeking out alternatives that they do feel good about using.

Also inevitable if the mass deception continues: More regulation. If businesses don’t behave ethically on their own, laws will be drawn up to force change.

Because sure, you can trick people for a while. But it’s not a sustainable strategy. Just look at the political pressure now being piled on Zuckerberg by US and EU lawmakers. Deception is a game that almost always fails in the end.

The way forward must be a new ethical deal for consumer web services — moving away from business models that monetize free access via deceptive data grabs.

This means trusting your users to put their faith in you because your business provides an innovative and honest service that people care about.

It also means rearchitecting systems to bake in privacy by design. Blockchain-based micro-payments may offer one way of opening up usage-based revenue streams that can offer an alternative or supplement to ads.

Where ad tech is concerned, there are also some interesting projects being worked on — such as the blockchain-based Brave browser which is aiming to build an ad targeting system that does local, on-device targeting (only needing to know the user’s language and a broad-brush regional location), rather than the current, cloud-based ad exchange model that’s built atop mass surveillance.
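
The architectural difference is worth spelling out. In the local-targeting model the article attributes to Brave, only coarse signals (language, broad region) ever leave the device; the behavioral profile stays local and all matching happens on-device. A toy sketch, with all ad data and interest sets invented for illustration:

```python
# Invented catalog: what a server might return for ("en", "EU").
AD_CATALOG = [
    {"id": "a1", "topic": "cycling"},
    {"id": "a2", "topic": "cooking"},
    {"id": "a3", "topic": "finance"},
]

def fetch_catalog(language: str, region: str):
    """Only these two broad-brush signals are sent off-device.

    In the cloud ad exchange model, this request would instead carry
    a detailed behavioral profile assembled by mass surveillance.
    """
    return AD_CATALOG

def pick_ad(catalog, local_interests):
    """Runs entirely on-device against the locally stored profile."""
    for ad in catalog:
        if ad["topic"] in local_interests:
            return ad["id"]
    return None  # no match: show nothing rather than phone home

# The interest profile never leaves the device; the server only ever
# learned the language and a coarse region.
chosen = pick_ad(fetch_catalog("en", "EU"), {"cooking", "chess"})
```

The privacy property falls out of the data flow, not a policy promise: the server cannot leak or sell a profile it never received.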

Technologists are often proud of their engineering ingenuity. But if all goes to plan, they’ll have lots more opportunities to crow about what they’ve built in future — because they won’t be too embarrassed to talk about it.
