
The Importance of Having a Purpose in Digital Transformation and Innovation

So many companies are investing in digital transformation and corporate innovation strategies to compete for the future. These efforts are typically led by technology and aimed at growth areas such as customer experience (CX). In actuality, they are most often not that innovative; they’re more iterative than groundbreaking. That’s OK, but these times necessitate a balance of innovation and iteration.

Many organizations are investing in new technologies and expertise simply to modernize legacy models, processes, and systems. However, to compete for the future and earn market relevance, modernization alone is not good enough. You have to give innovation a purpose. You have to give meaning to your work so that the organization can stand behind something that’s meaningful and believable. Technology isn’t the answer; it’s an essential enabler of a greater vision and mission.

What’s your purpose?

Credit: DrivingSales | DSES

Brian Solis

Brian Solis is principal analyst and futurist at Altimeter, the digital analyst group at Prophet, a world-renowned keynote speaker, and a 7x best-selling author. His latest book, X: Where Business Meets Design, explores the future of brand and customer engagement through experience design.

Invite him to speak at your event or bring him in to inspire and change executive mindsets.

Connect with Brian!

Twitter: @briansolis
Facebook: TheBrianSolis
LinkedIn: BrianSolis
Instagram: BrianSolis
Youtube: BrianSolisTV



Brian Solis

Community Plus Purpose Equals Business Advantage

A well-managed online community can be a business advantage.

That’s not just something I dreamed up; it’s the conclusion of a study entitled, “What Creates Advantage in the ‘Social Era’?” In this 2015 study, authors Tim Kastelle, Nilofer Merchant, and Martie Verreynne set out to discover what would provide an advantage to businesses in the era of the Internet.

In previous eras, simply getting your strategy right created an advantage. In the Social Era, though, honing your strategy only moves you from the ‘No Advantage’ region of the map into the first or second row. What causes the big jump in performance, that extra 30% increase or more, is combining community (the ideas of many) with purpose (alignment to your mission). That’s the recipe for the giant leap to the top of the map.

Click to read the full study: What Creates Advantage in the ‘Social Era?’

The research reviews how, over time, what was once necessary to gain advantage becomes “table stakes.” It’s no longer enough to have the lowest price, best product, or access to capital. It’s not enough to have a unique niche. Those things are just necessary to survive. In order to thrive, businesses must figure out how community can be applied to their unique strategy.

Technological change has made collaboration a necessary part of economic production. And those businesses that can apply collaboration thoughtfully will gain advantage.

Key Takeaways for your Community

  • Treat your customers and/or employees as co-creators of value within your organization, not just value extractors
  • Celebrate unique points of view and outside ideas
  • Clarify your business mission, and then use that as a controlling force for your community strategy
  • Ensure that your community guidelines are transparent
  • Recognize that when your community invests in an idea, it co-owns its success

Can you think of other ways you can weave purpose into your community strategy?

This blog post was originally published by Rosemary O’Neill on Social Strata.



Social Media Explorer

Redesigning with a Purpose in Mind

We’re so constantly bombarded with stimuli that our brains are built to weed out all but the most urgent incoming material. So for the first issue in our third decade of publishing, we decided to shake things up: our brand mark’s design, while attractive, hadn’t been matching our messaging, and it was time to set that straight.

Brand Packaging’s RSS Feed

Breaking the Recursive Loop: Why Purpose Makes Great Programmers

One of the easiest ways to make money these days is to get into the digital world. Programmers are all the rage, with the world becoming even more dependent on technology and interconnectedness with each passing day.

Becoming a programmer really isn’t as hard as it seems, nor as hard as it was two to three decades ago. You don’t even need a degree in computer science to become an expert at C++ or PHP – all the resources you need are readily available and just within your reach.

Pick a programming language, grab a copy of the relevant “for Dummies” title from your local bookstore, sit in front of a computer, and start typing away. In fact, there are tons of online courses out there that give you enough knowledge to start working on your own programming projects.

However, if you want to become a great programmer, that’s a totally different story, and it requires one crucial ingredient: A purpose.

More than simply profiting from addressing other people’s needs, great programmers are driven to excellence by the desire to provide digital solutions to persistent problems, whether by resolving issues or by making processes easier. Great programmers take the time to learn programming not as an easy way to earn money, but as a way to develop concrete, reliable solutions to puzzles and problems. Whether that means creating applications, upgrading systems, or creating entirely new programming languages, the greatest programmers are the people who took one look at their lives and decided to stop complaining and start working.

Here are some examples of the world’s greatest programmers (in no particular order). Through their persistence and steadfast refusal to give up in the face of obstacles, these coding masters have left a lasting digital mark on the world.

1. Alan Turing

Alan Turing
Image Credit: Wikipedia

One could say that Alan Turing was way ahead of his time. Ever since he was a young boy, Turing had a difficult time adjusting to the world around him – or rather, perhaps the world had a difficult time keeping up with him. Deeply in love with the sciences, Turing took great joy in absorbing and mastering advanced scientific and mathematical concepts from an early age, despite the disapproval of his peers and even his teachers at Sherborne School.

Because of his obsession with mathematics, science, and cryptography, Turing was recruited to help lead the cracking of Enigma, the machine the Germans used to encipher military communications during World War II. What made Enigma special was that each time a letter was pressed, the machine’s rotors advanced, so pressing the same letter again produced a different encrypted letter; no single fixed substitution could decrypt the messages, and the settings changed daily. By the time cryptologists managed to decipher a message by hand, several hours had passed, rendering the intelligence moot.
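
To see the stepping principle in miniature, here is a toy sketch in Python. It is nothing like the real machine, which had three rotors, a reflector, and a plugboard; it only illustrates how a rotor that advances after every keypress makes the same letter encrypt differently each time.

```python
# Toy illustration of a single stepping rotor (NOT a real Enigma).
import string

ALPHABET = string.ascii_uppercase
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"  # wiring of the historical Rotor I

def encrypt(text):
    position = 0  # rotor offset; it advances with every keypress
    out = []
    for ch in text:
        i = (ALPHABET.index(ch) + position) % 26
        out.append(ROTOR[i])
        position += 1  # the rotor steps, changing the mapping
    return "".join(out)

print(encrypt("AAAA"))  # prints 'EKMF' -- one letter, four different outputs
```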

It was this love for cryptography and mathematics that led Turing to take up a key role in the war effort. Even though his approach was deemed impossible by superiors and peers, he continued to work on it for months, designing the electromechanical Bombe, which built on earlier Polish work and could search through Enigma’s settings fast enough to decrypt intercepted messages while the intelligence was still fresh.

2. Ada Lovelace

Ada Lovelace
Image credit: Wikipedia

Ada Byron, Countess of Lovelace, is known as the founder of scientific computing. Mathematics played a significant role in her life from an early age: she was developing plans for a flying machine at the tender age of 13.

Eventually, her fascination with machines led her to be deeply intrigued by Charles Babbage’s Analytical Engine, whose design anticipated the basic elements of a modern computer.

In 1842, a hundred years before the first computer was made, Lovelace translated an article by the mathematician Luigi Menabrea which described Charles Babbage’s Analytical Engine. Because of Lovelace’s genius, Babbage asked her to expand the article, which resulted in a document three times the length of the original. According to Babbage, Lovelace was “the Enchantress of Numbers.”

Unfortunately, due to lack of funding, the Analytical Engine remained only a concept. It stayed that way until the 1940s, when pioneers such as Alan Turing, who engaged with her notes, helped bring the first working computers into the world.

In her notes, Lovelace argued that the Analytical Engine was a general-purpose computing machine and predicted that it would lead to many developments in the future. Her vision has since inspired and empowered women to enter the STEM fields (science, technology, engineering, and math).

3. Rear Admiral Grace Murray Hopper

Admiral Grace Hopper
Image credit: Wikipedia

Grace Hopper was an officer in the United States Navy from 1943 until 1986. She was also a computer scientist responsible for creating the first compiler for a programming language, called the A-0. For the uninitiated, a compiler translates human-readable source code into a form the machine can execute.
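
To make that idea concrete, here is a minimal sketch in Python of the translation step a compiler performs; this is purely illustrative and bears no relation to how the A-0 actually worked. Human-readable source goes in; machine-executable instructions come out.

```python
# A toy "compiler": translate flat arithmetic source (e.g. "2 + 3 * 4")
# into stack-machine instructions, then execute them. Evaluation is
# strictly left to right -- no operator precedence, unlike real compilers.

def compile_expr(tokens):
    """Compile alternating number/operator tokens into opcodes."""
    program = [("PUSH", int(tokens[0]))]
    for op, operand in zip(tokens[1::2], tokens[2::2]):
        program.append(("PUSH", int(operand)))
        program.append(("ADD" if op == "+" else "MUL", None))
    return program

def run(program):
    """A tiny stack machine that executes the compiled opcodes."""
    stack = []
    for opcode, arg in program:
        if opcode == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if opcode == "ADD" else a * b)
    return stack.pop()

source = "2 + 3 * 4".split()
print(compile_expr(source))      # [('PUSH', 2), ('PUSH', 3), ('ADD', None), ...]
print(run(compile_expr(source))) # 20, i.e. (2 + 3) * 4 evaluated left to right
```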

According to Hopper, at first nobody believed she had a working compiler because, as the doubters put it, “computers could only do arithmetic.” Thanks to her compiler, the world of programming became faster and more efficient.

Fortunately for the world, she was also instrumental in the creation of COBOL (common business-oriented language), a programming language designed for business use. The fascinating part? COBOL programs are still being used by governments and businesses to this day.

4. Tim Berners-Lee

Tim Berners-Lee
Image credit: Wikipedia

To put it succinctly, Sir Timothy John “Tim” Berners-Lee is the reason why you can read this article. Hailing from Britain, Berners-Lee is credited with the creation of the World Wide Web; after making a proposal in March 1989 for a system of information management, he achieved the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server over the Internet in November of the same year. Currently serving as the director of the World Wide Web Consortium (W3C), Berners-Lee has been awarded time and again for his contributions to the field and to the world.
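
That client-server exchange is still remarkably simple at its core. Here is a hedged sketch in Python of a raw HTTP request over a TCP socket, using example.com as a placeholder host; a real browser does essentially the same thing, just with many more headers.

```python
# Speak HTTP by hand: open a TCP connection, send a plain-text request,
# and read the plain-text response -- the exchange Berners-Lee pioneered.
import socket

host = "example.com"  # placeholder; any HTTP server will do
with socket.create_connection((host, 80)) as sock:
    request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.split(b"\r\n")[0].decode())  # status line, e.g. 'HTTP/1.1 200 OK'
```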

5. Linus Torvalds

Linus Torvalds
Image credit: Wikipedia

It is quite difficult to overstate the importance of Linux, possibly the world’s most widely used open-source software. Linus Torvalds is the brains behind the creation of Linux; he came up with the idea of developing his own kernel in 1991, frustrated that the GNU project’s kernel was unavailable and that existing alternatives carried restrictive licenses. Initially developed as a free operating system for personal computers, Linux has been ported to many hardware platforms, eventually making it possible for programmers to develop Android, a popular operating system for smart devices.

6. Jeff Dean

Jeff Dean
Image credit: Google+

Jeff Dean is the person everyone should thank for the smoothness of Google search. He is one of the engineers behind Google’s search indexing systems, which everyone in the world now enjoys: with just a few keystrokes and a click, thousands of results are displayed in about a second. Without his brains, other search engines could have overtaken Google a long time ago. His fame among Googlers and ex-Googlers is at Chuck Norris levels.
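
The core idea that makes such speed possible is the inverted index: rather than scanning every document for every query, you map each word to the documents containing it ahead of time. Here is a toy sketch in Python; a simplification, of course, as production search systems are vastly more elaborate.

```python
# A toy inverted index: word -> set of document IDs containing it.
from collections import defaultdict

documents = {
    1: "the quick brown fox",
    2: "the lazy dog",
    3: "quick thinking saves the dog",
}

index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.split():
        index[word].add(doc_id)  # built once, queried many times

def search(*words):
    """Return IDs of documents containing every query word."""
    results = [index[w] for w in words]
    return set.intersection(*results) if results else set()

print(search("quick"))         # {1, 3}
print(search("quick", "dog"))  # {3}
```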

7. Dennis Ritchie

Dennis Ritchie
Image credit: Wikipedia

The creator of the C programming language, the American computer scientist Dennis Ritchie is considered to be one of the digital era’s foremost pioneers. Intended to encourage structured programming – an approach that breaks large applications down into smaller components that are easier to manage – the C language was in turn used to rewrite the UNIX operating system, particularly to make it portable. C has since been used in virtually everything from operating systems to software applications, and has served as a guide for many modern programming languages.

8. Steve Wozniak

Steve Wozniak
Image credit: Wikipedia http://en.wikipedia.org/wiki/Steve_Wozniak

Unknown to many, there were two Steves among the co-founders of Apple Inc.: Steve Jobs, whom everyone knows, and Steve Wozniak, probably one of the most brilliant programmers to have ever lived. Together with Jobs (who didn’t code) and Ronald Wayne (a third co-founder), Wozniak developed the Apple I, then went on to develop the Apple II, which made Apple one of the leading figures in the world of microcomputing. Some would even argue that Steve Wozniak was responsible for the personal computer revolution.

9. Bill Gates

Bill Gates
Image credit: Wikipedia

Who doesn’t know Bill Gates? Together with Paul Allen, he built the tech empire that is Microsoft. Before he became famous as a philanthropist, Bill Gates was a monster of a coder. His attention to detail and discipline reflected his high intellect and helped make Microsoft what it is today. His goal was to put Microsoft’s software on every computer, and he came remarkably close, which helped a lot of people and companies.

10. Ward Cunningham

Ward Cunningham
Image credit: Wikipedia

Ward Cunningham is the person responsible for developing the first wiki. He is someone every student and researcher should thank, because without him, Wikipedia would probably never have been born.

When Cunningham was asked in a 2006 interview whether he had ever thought of patenting his wiki idea, especially once it became popular, he responded that the idea “just sounded like something that no one would want to pay money for.”

In the computing world, Cunningham is well known as someone who shares his ideas openly, the wiki being the prime example, alongside his influential work on software design patterns.
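
The original wiki concept is simple enough to sketch in a few lines of Python: editable pages stored by name, with CamelCase words automatically turned into links. This is only an illustrative toy, not Cunningham’s actual implementation.

```python
# A toy wiki: pages anyone can save, with CamelCase words rendered as links.
import re

pages = {}  # page name -> raw text

def save(name, text):
    pages[name] = text

def render(name):
    """Apply the first wiki's linking rule: CamelCase words become links."""
    text = pages.get(name, "(this page does not exist yet)")
    return re.sub(r"\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b", r"[\1]", text)

save("FrontPage", "Welcome! See DesignPatterns for the pattern repository.")
print(render("FrontPage"))
# Welcome! See [DesignPatterns] for the pattern repository.
```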

To End

The thing is, you don’t have to build an entire programming language or start a multi-billion-dollar company to become a great programmer. Greatness lies in your intent and its ripple effects, not in your mastery of a specific language.

These people found something that needed to be fixed and acted on it. They didn’t learn how to program just because it would further their careers; they did it in order to solve a problem. Being a programmer wasn’t simply a job to them, but a lifestyle. And they have inspired people, too.

There are hundreds, even thousands, of great programmers out there who didn’t make our list. Who do you think is the greatest programmer? Don’t forget to comment below!


Onextrapixel – Web Design and Development Online Magazine

Exploring Creativity In-Depth: The Practical Purpose of Creativity

Figuring out the true meaning and purpose of creativity is a rather complicated subject. Sure, dictionaries and teachers can come up with a definition for it in a jiffy, but creativity as a whole can be approached in different ways. Here’s one way of looking at creativity that you probably haven’t heard yet: Without creativity, none of us would be here today. We need creativity to survive – we always have.

Hold on. You might be wondering – how can something that delves deeply into the subjective be considered an integral part of existence? You may even argue that creativity doesn’t seem to be needed for basic actions like eating, sleeping, or walking. To fully understand this new “theory” though – the theory of creativity you probably haven’t heard about – let’s first take a look at what experts and scientists believe about creativity.

How, Why and When We Are Creative

Theories on Creativity

Meeting the Borg Queen
Image credit: Eddi van W.

There are a number of major theories on creativity, and while they may not exactly contradict each other, they all offer unique – and sometimes radical – views on how and why people are able to harness the power of their imagination.

For starters, the psychoanalytical theory of creativity proposes that creativity is brought about by repressed emotions or oppressive circumstances. Basically, it says that creativity is born out of negative feelings and experiences as a coping mechanism of sorts.

Meanwhile, the mental illness theory suggests that the presence of any sort of mental illness – whether mild or severe – is a prerequisite for creativity. This isn’t necessarily true, some argue, as mental illnesses are seen as barriers that hinder rather than encourage creativity.

The relative theory of psychoticism takes it a step further, and in the opposite direction: According to the theory, all creatives have a disposition for psychotic tendencies – the very same tendencies that serve as the foundation for creative personalities. Recent studies, however, suggest that it’s actually more likely for creatives to be related to people with mental disorders than to actually suffer from them.

Another theory, the succinctly-named addiction theory, claims that substance use and addiction contribute to – and may even cause – creativity. Of course, the case could be made that creativity was inspired by the catalyst for the addiction (e.g., a major emotional problem that led to drug use) and not the addiction itself. Some even believe it’s the other way around – creatives just get hooked much quicker.

However, the most widely accepted theory is the humanistic theory. According to this theory, creativity occurs once human beings have met their basic needs: the state of self-actualization is then reached, which frees us from distractions and allows us to truly set our creative minds loose.

Can Animals Be Creative, Too?

Painter
Image credit: Pierre Pouliquin

Interestingly, evidence suggests that creativity is not a characteristic solely evident in human beings.

Orangutans, elephants, crows, octopuses and even rodents have been known and observed to use tools – a concept that scientists once thought was uniquely human. That’s not all; these animals have also been known to produce what appear to be works of art. Chimpanzees finger-paint, for example, while bees build beautifully symmetrical homes. It can even be argued that whale songs may be artistic in nature as well.

One of the most interesting examples of creativity in animals is the story of Imo, a female macaque who was part of a group studied by primatologists on an island off the coast of Japan in 1953. Imo was observed carrying a dirty sweet potato – one of the treats the scientists offered to habituate the animals – and washing it in a freshwater stream before eating it. This was behavior no primatologist had observed before: an “effective novelty” that the rest of her family soon adopted. The behavior outlived her, too; the island’s macaques were still washing their sweet potatoes fifty years later, long after Imo and her generation had passed away.

If there’s one thing that this proves, it’s that you don’t need to be as smart as a human in order to be creative. Which leads us to the next question – exactly how smart are creatives, then?

Creativity and Intelligence

Child Head
Image credit: Charly W Karl

Despite what many people seem to think, you don’t have to be a genius to be creative. As a matter of fact, while your intelligence quotient (IQ) does have something to do with creative thinking, it is more strongly connected to the “thinking” part than the “creative” part.

Consider this: the main method of measuring your IQ depends on your ability to absorb and process information, and to use the data on hand to come up with a solution to a specific problem, regardless of whatever external factors may be present at the moment. This means that while intelligence does have a relationship with creativity, it plays a more significant role in understanding and refining the creative process than in actually kickstarting it.

The fundamental difference between intelligence and creativity is that the former is focused on information-gathering and usage, while the latter transcends the boundaries of the intellect and explores possibilities by connecting concepts and ideas that may or may not be related at first.

Truth be told, people like Lord Byron or Leonardo da Vinci probably didn’t need to be smarter than the rest of us to develop their creativity. They simply had a different approach to doing things – one that might be worth adopting in your day-to-day routine, and one that can be summarized in three steps.

  1. Make the most out of life by mixing the right amounts of caution and adventure.
  2. Ponder your experiences and try to learn as much as you can from both victories and mistakes.
  3. Don’t be afraid to give your ideas a shot.

Innovation: The Practical (and Evolutionary) Purpose of Creativity

Now that the basics of creativity have been sufficiently tackled, we can move on to The Big Question: Why, then, is creativity necessary for survival?

Answer: It’s because of one simple word – “innovation.”

“Creativity” and “innovation” are two terms that are sometimes taken to mean the same thing. That’s not surprising, considering that both involve thinking outside the box, either to improve upon existing products, concepts, and processes or to build completely new ones from scratch.

However, there are a few key differences between creativity and innovation.

Creativity is difficult to measure, as it pertains to thoughts – the products of one’s imagination that are not necessarily (and more often than not, aren’t) tangible. On the other hand, innovation, which involves putting new and improved ideas into action, can be measured. When one introduces a new step in a process to make it more efficient, addresses a previously unidentified need, or introduces an entirely new product or procedure, that is creativity in action – that is innovation.

As author Shawn Hunter puts it: “Creativity isn’t necessarily innovation. If you have a brainstorm meeting and dream up dozens of new ideas, then you have displayed creativity, but there is no innovation until something gets implemented.”

Without creativity, there can be no innovation. Innovation is to creativity as technology is to science – the application of ideas to either provide solutions to our biggest problems or improve our quality of life. In order for humanity to survive over millennia, it had to learn how to adapt and evolve. Our ancestors needed to learn how to build shelters, make fire, and figure out how to get food. Without creativity, we wouldn’t have automobiles, or medicine, or cellphones, or the Internet. We probably wouldn’t have survived as a species as long as we have if our forefathers hadn’t put their ideas into action.

Conclusion

So take a break from your routine today. Think of a new idea. Or just daydream. Who knows – those fanciful little thoughts swimming in your head right now might be the key to solving tomorrow’s problems.


Onextrapixel – Web Design and Development Online Magazine

Did Penguin Just Design the Worst Book Cover of All Time on Purpose?

The 50th anniversary publication of Roald Dahl's children's classic Charlie and the Chocolate Factory is leaving a bad taste in some mouths.

Controversy surrounds the cover of the Penguin Modern Classics edition, which eschews Willy Wonka's fanciful factory, golden tickets, Oompa-Loompas and other familiar story elements. Instead, we get a stylized image of a young girl, coiffed to the hilt in colorful bows and silks, sitting in her mother's lap.

Detractors are denouncing the shot for sexualizing kids, and they deride its sleazy '60s vibe as inappropriate for a story geared toward young people. They have a valid point, though in fairness, the broader meaning of the image is open to all sorts of interpretations. (It's not overtly sexual. I mean, we don't see Wonka's willy, thank goodness.)

The picture is a cropped version of a photo used in a 2008 fashion magazine feature (see below) completely unrelated to Dahl and the book in question. According to the publisher, the cover "looks at the children at the center of the story, and highlights the way Roald Dahl's writing manages to embrace both the light and the dark aspects of life."

The tale does mix in stronger themes about child-parent relationships and manipulation (memorably personified by bratty schoolgirl Veruca Salt). Still, that's hardly the book's primary focus, and it's tempting to dismiss Penguin's explanation as candy coating for a publicity ploy designed to drive debate and sell copies. (The publisher certainly seems to be relishing the attention.)

Among the public, bitter reactions outweigh the sweet, with most reasoned negative opinions running along the lines of this comment in Creative Review: "It seems a bit misleading, doesn't it? If I knew nothing about the book, this cover would suggest to me that it's a really disturbing story for adults, probably a thriller about young girls in the beauty industry."

The most deliciously snarky critique comes from the Guardian, which calls it the worst cover of all time, grousing that the image "reimagines Dahl's classic as 1960s Wyndhamesque horror, robotic alien children stranded in a stark asylum."

Here's the original photo, published in Numéro magazine in 2008.





Adweek : Advertising & Branding