Is it right for business to be making the rules for the AI industry?


This topic contains 7 replies, has 7 voices, and was last updated by Paul 1 year, 9 months ago.


    And do they have our best interests at heart?

    • This topic was modified 1 year, 9 months ago by Oliver.

    Businesses are inherently going to have their profits at heart, although this isn’t necessarily at odds with the best interests of the public. Look at the number of AI startups in the healthcare space, for example: they all want to make a profit, but help society at the same time. What is important is oversight, and I think the best way to achieve this is for companies to open source their code, which is something that Elon Musk’s company OpenAI is pushing for.


    So it sounds like the alternatives are (1) the government steps in to regulate AI development, or (2) a neutral third party advises on how development is to proceed.

    Some problems:
    (1) Most people in government are not likely up to speed on the state of AI today, and what AI may soon be capable of. Many people in government today still don’t seem to understand how the internet works, or the importance of policies like Net Neutrality.

    (2) To continue the internet example, organizations like the EFF have advocated for important policies such as Net Neutrality. OpenAI could be such an organization for the AI industry, making important technologies open source, and bringing together experts to decide on policies and standards. The question becomes, will businesses follow OpenAI’s recommendations? Who enforces them? What happens when a self-driving pickup truck by some automaker promises an AI that drives “10% more aggressively to get you there 5% faster,” overriding recommendations for safe autonomous driving?

    Eventually, I see governments having to step in to set regulations and standards. Will they be able to keep up with new developments? Will their regulations overreach and hold back research and development?

    This gives me hope: the internet is largely governed by various non-governmental and volunteer organizations such as ICANN, ISOC, the IETF, and others. The fact that the internet works as well as it does, with governments, interconnected regional entities, and non-profit organizations working together, is something of a miracle to me. My hope is that one day AI will be similarly governed.


    Having worked with government departments in the UK, it is clear they are very behind in terms of technological understanding, although there are a small number of individuals trying to rectify this. They frequently make unscientific, non-evidence-based policy decisions (e.g. funding homeopathy through the NHS, a restrictive stance on GM crops, and the criminalisation of recreational drug use, to name but a few), so I have little confidence in their capability to regulate the AI industry. A non-governmental organisation, as you suggested @mikhail, is the likely direction in my opinion.


    @mikhail, your example of an automaker promising a certain style of driving is spot on; that is totally going to happen! However, I don’t see how governments are going to help prevent it. I mean, look at how in the pockets of other industries, like tobacco and banking, they are. If there’s a lot of money in it, then big business will take advantage of it irrespective of how harmful it could turn out to be (cf. climate change). I think it is safer in the hands of the open source community, and possibly start-ups/small companies that are heavily involved with the open source community.


    I think this topic involves everyone: governments, organisations both for-profit and non-profit, and individuals. As such, a body that represents all of them sounds more appropriate. This would also ensure that no one party has a monopoly over the topic, and conflicts of interest would be identified and (hopefully) addressed transparently.


    @jaffar, whilst I agree that your suggestion is optimal, I think by the time all of those various entities figure out who makes the rules and what those rules are, AI is going to be making its own rules. Someone needs to take the reins, like Elon Musk is trying to do with OpenAI.


    I agree with @jaffar; I think it is going to be, and indeed needs to be, a collaboration between businesses, governments, non-profits, etc. @robbo, I have a bit more faith than you that this is achievable!

    @mikhail, I think you’re being a little harsh on the government there! Having also worked in/with numerous government departments in the UK, I can say there is currently a big push to become more data-driven, and understanding AI is part of that. Whether they choose to act appropriately or not is another matter!

Viewing 8 posts - 1 through 8 (of 8 total)
