Corporate activism takes on precarious role
Microsoft president says it can’t be a reaction to social media pressure
Whether we like it or not, the public is increasingly turning not to its elected officials but to the heads of major corporations for leadership on important and difficult issues. Perhaps it's partisan gridlock, or perhaps politicians simply seem to pay more attention when Big Business talks. Rightly or wrongly, people expect today's CEOs to pick up the ball.
That’s both good and bad for society, according to Brad Smith, Microsoft’s president and chief legal officer.
“I think it’s not only good, but fundamentally important that companies have a conscience,” he said during a talk with Harvard Business Review editor Adi Ignatius last Thursday about the rise in corporate activism. The discussion was part of HUBweek, the innovation and ideas festival launched in 2014 by Harvard University, The Boston Globe, Massachusetts Institute of Technology, and Massachusetts General Hospital.
People, especially younger workers, want to work for businesses that operate conscientiously. And as companies expand globally, particularly those built on creativity and intellectual property, having a conscience becomes not just a nice thing to do but an imperative for corporate survival. "You better have a conscience," Smith said.
Nike's new endorsement contract with Colin Kaepernick, the former NFL star who protested police violence against African-Americans, and retailer Dick's Sporting Goods' decision to stop selling assault-style rifles shortly after the shooting that killed 17 people at Marjory Stoneman Douglas High School in Parkland, Fla., this past February are examples of high-stakes corporate activism that benefited the bottom line.
But there’s a downside, and things have gotten “a little out of balance at the moment,” Smith said. Corporate executives are not elected representatives and therefore shouldn’t wield vast social or political influence, nor should corporations make decisions about whether they will be regulated. That’s still the job of Congress, the president, and other government officials, he said.
How and when companies will use their voice to take a stand on an issue requires careful consideration, he said. It cannot be only in reaction to social media pressure.
“Few questions require more thought … than [the] question of when do you use the company’s voice and when do you not,” Smith said.
For Microsoft, that means responding when an issue is important to the business, important to customers, and, increasingly, important to its employees, both at work and outside it.
On looming issues like the impact and ethics of artificial intelligence (AI), Smith said the technology industry has to play a constructive role in anticipating and addressing the direct and indirect spillover effects the technology will have on the workforce and on society. Because, as he put it, "every ethical issue in the history of humanity is now an ethical issue for computers" that make decisions, exploring the potential unintended consequences of AI will require an interdisciplinary approach, one that connects computer science, engineering, and the other STEM fields with, perhaps surprisingly, the liberal arts, to get through what are bound to be "some rocky times" in the coming decade.
Smith said that, much as in the long-ago debate over cameras in the public sphere, companies developing innovations like facial-recognition software need to think about the privacy implications of what they're building.
"So, I think we have to ask ourselves, as people did a century or more ago, what kind of world do we want to live in, what kind of society do we want to have, where do we want this technology to be used, and how and where do we say no, we don't feel comfortable having it used that way?"