The Context - BBC News - April 25, 2024, 8:30pm-9:01pm BST

8:30 pm
hello, i'm christian fraser. you're watching the context on bbc news. ai tech is evolving at breakneck pace as billions head to the polls this year. coming up on ai decoded, we look at the potential for harm. ai decoded is coming up, but before that... it's time for sport, and for a full round-up from the bbc sport centre, here's lizzie greenwood-hughes. onto football, and it's manchester city's turn in the premier league title race. they're playing brighton away tonight, knowing a win would move them within a point of the leaders arsenal with a game in hand. the game has been going for around half an hour, and city are two goals up. kevin de bruyne with a brilliant dipping header.
8:31 pm
phil foden added their second, a lucky deflection from a free kick, so it's 2-0 at the moment to manchester city. more now on liverpool's new manager situation and who will replace jurgen klopp, because arne slot has confirmed to dutch media that he wants to join liverpool. slot is currently in charge of the rotterdam side feyenoord, who are playing tonight and are second in the eredivisie. he reportedly told espn tonight that "the two clubs are negotiating" and that he is "awaiting the outcome". the man behind red bull's formula 1 success, adrian newey, is leaving following the controversy around the team principal christian horner. newey is regarded as the greatest f1 designer in history, and as red bull's chief technical officer for nearly 20 years, he's credited with both of red bull's hugely successful eras, including sebastian vettel's four world titles and max verstappen's current domination. before that, the british designer worked at williams and mclaren for drivers like nigel mansell,
8:32 pm
alain prost and mika hakkinen. but since horner was accused of sexual harassment and coercive, abusive behaviour by a female employee, which horner denies, newey is said to have been unsettled by the situation at red bull. ronnie o'sullivan's hopes of a record eighth world snooker championship title are still alive after he eased into the second round in sheffield. the rocket won the two frames he needed to finish off his opening match against jackson page, sealing a 10-1 victory and setting up a meeting with another welshman, ryan day, next. o'sullivan is also aiming to complete snooker's triple crown in a single season following his wins at the uk championship and the masters. i've already broke the rules for snooker, as far as i'm concerned. you know, i'm still going at 48, 49, whatever it is, so, you know, i'm just seeing what's possible now. you know, what is possible in this game? how long can you keep going for, you know? can i win a world championship at 50? who knows. but i'm probably the only player
8:33 pm
that's able to do that, so let's see, let's have an experiment, you know? geraint thomas will be the lead rider for the ineos grenadiers in cycling's first grand tour of the year, the giro d'italia. the former tour de france champion missed out on victory in the giro last year by just 14 seconds to slovenia's primoz roglic. if thomas does come out on top this time, he'll be the oldest winner of the giro at 38. the race gets underway near turin a week on saturday and will end in rome on the 26th of may. and that's all the sport for now. thank you very much. it's my favourite part of the week. it is time for our weekly special on the world of artificial intelligence. welcome to ai decoded, our new and expanded edition. same shape, same themes, but as we have already established in previous weeks, it's a technology developing so fast that really it gives new context to everything we discuss
8:34 pm
on this programme. tonight, the political fight over disinformation. the new york times has an interview today with nina jankowicz. two years ago, she was appointed to an agency within homeland security tasked with policing online disinformation, but under a torrent of abuse and political attacks from the right, her agency was abolished before her work even began. ai is driving a quantum leap in disinformation. the guardian has a similar interview with the french minister for europe, who says his country is being pounded by russian disinformation ahead of european elections in june. so what do we do? tonight, we will speak to one of the world's leading voices in artificial intelligence, the winner of the turing award, computer scientist professor yoshua bengio. also here, one of the leading research companies monitoring and countering this threat online. how do they do it? and shilo is back, our virtual ai companion, who was a little shy last week, but i am expecting big things from her tonight.
8:35 pm
also here, welcome back to priya lakhani, our regular ai contributor, ceo of century tech, who will be guiding us through it all. this interview really puts a spotlight on this problem with disinformation, a wave coming from china and russia. but i was not aware of how aggressive the attacks have been from the right in america on the researchers and scientists who are trying to find it, and she really sets it out in the article. what she is saying is that there are activists, lawmakers even, from the right, who are trying to suppress the research that researchers are doing on disinformation, because the research might be uncovering stories that
8:36 pm
maybe go against the values of the right, that may be too liberal. this is where you have this really muddied area, because this is about disinformation, about fighting for truth, so this is not just about... what does freedom of speech actually mean? is it freedom of truth, freedom of opinion, freedom to offend, maybe saying something someone does not agree with? but freedom of speech is not freedom of disinformation, freedom of misinformation. and dissemination of that. exactly. so studies are needed, researchers are needed. and then the second story, about what the french are saying: a third of the world's population is going through elections this year, so to have this other body to be able to tackle this is so important. let's get an expert view, because if ai is supercharging this threat, how much responsibility do the creators have to tackle it? and is the ai they are developing capable of identifying and removing the problem? we have a brilliant guest for you this week, one of the godfathers of ai, and i don't use that term lightly.
8:37 pm
professor yoshua bengio was this past week named one of time magazine's 100 most influential people worldwide. his research has been instrumental in shaping the world's understanding of ai, its potential and its threats. it earned him the 2018 turing award, the nobel prize of computing, alongside geoffrey hinton and yann lecun. yoshua bengio, welcome to the programme. hello, thanks for having me. help us understand what these ai bots can do and what we are up against. well, what is already happening is we can use these ai systems to synthesise content in a controlled way. so content could be voices, images, videos or texts. controlled means you can specify, for example, to imitate a particular person based on the data we have about that person; you can specify what that person would say. recently i have seen a
8:38 pm
fake me saying things i would never say, and i must say it's shocking, more than seeing others being imitated, but it is also something chilling that we have to do something about. and it's so important to get to the bottom of this. we had the ai summit late last year and the idea of red teaming, evaluation of large language models. you've done lots of work on secure systems, and you write about an ai scientist. do you think that governments have understood these threats properly, and do you think those methods are going to be enough to protect us from all these risks? well, we don't have silver bullets, and governments are not doing enough, but compared to a year ago the situation is a lot better. of course, one problem is that the technology is changing and these ai systems are becoming more capable. and one of the things
8:39 pm
i'm more worried about, especially in the context of political information, is we now have ai systems that master language. one concern that scientists have is that bad actors tune them, which might not require a lot of effort, so they can become really good at making you change your mind: persuasion. and you can see how that could be dangerous in the context of elections. so one thing is seeing memes, seeing videos of people, and you might think it's a fake, or lots of people might even actually get trapped by that. but if you have a dialogue over days, weeks, with an entity that you think is human, it's very plausible that they can change your mind, and you can imagine the consequences. we are talking about bots and rogue actors, but if this is going to be handled properly, there has to be accessibility to the black box, to the models themselves, and there has to be transparency.
8:40 pm
one of the things that alarmed me, when the european union asked about this, was sam altman saying "we will just leave". is that the right approach? well, i don't think we should give in to these threats. i think that governments should not, you know, fail to do what they need to do to protect the public because of such declarations. it's really important that we do the right things, and we need to do lots of things, but, as you said, transparency. but let's be careful about what that means. we want governments or independent auditors and scientists to be able to see what these companies are doing, to see what they are doing to secure their systems so that they don't fall into the wrong hands, to see what they have done to make sure that the protections are good enough. and then if they are
8:41 pm
not good enough, then they should not be allowed to operate. that's the sort of thing we need. we also need to think about, well, let's say all of this does not work, so bad actors take hold of these things and they start doing bad things. depending on the kind of threat, we need to think of countermeasures for each. we're talking about disinformation, but people are also worried about designing weapons, including biological weapons. in the case of disinformation, people are talking about being more strict about the identity behind each account, so that no single person could have 1,000 accounts, right? which allows bad actors to quickly disseminate information compared to if they were only allowed to have one account. and there are ways to do this without necessarily violating privacy. and can i ask you... you mention bad actors quite a lot, and being able to evaluate models, but when it comes to open-source models: there are closed-off models from some companies, and then there are
8:42 pm
companies developing open-source models, and the idea of course is that those models are available to others to fine-tune. we know that on the dark web you can buy some of these models that have already been fine-tuned for some pretty nasty things. is it your opinion that we should shut down open-source models? or how do we regulate those models once they are out there in the wild west, while also taking into account the fact that there is then a very strong powerhouse of those closed-source model companies? how can we tackle this? yeah, these are hard questions and i wish i had all the answers... i've got more questions. laughter. let me say a few words about open source, because it comes with a lot of advantages. some of these advances came about because we have access to the meta open-source model and others. but we also have, of course, the dangers that if those models are too capable and they fall into the wrong hands, then there is nothing any
8:43 pm
regulation is going to do to help prevent misuse that's dangerous for the population. so i think the first thing is to consider who should decide whether a particular model has more cons than pros, and i don't think it should be the ceo of a company. i think it should be the regulator, the public in some way, because there is a social choice here between the pros and the cons. i think that means we need governments to get involved in the decisions, and we need to evaluate the abilities of these models before the decision is taken. can the model be used for x, y or z, which can be dangerous? and if the answer is yes, in ways that are too concerning, then maybe the regulator's decision will be no, you're not allowed to make that public. that's the reason, for example, why the biden executive order is asking companies to make their systems secure. secure means they are not accessible to anybody, and of course open source is
8:44 pm
available to anybody. once it's shared, you cannot take it back. it's copied all over the world and we don't have any control any more. i wish we could talk more, but we are pressed for time. is it true that you are off to the gala, the time 100 gala, tonight? yes, i'm almost dressed up for that. it's going to be exciting. very, very proud to be part of that, but also glad that this visibility is an opportunity to talk about these hard issues of the kind we have been discussing today. honoured to have you on the programme. congratulations on your award and thank you for coming on to speak to us. coming up, our virtual avatar shilo will be here, who no doubt will be interested to hear that the ai start-up synthesia has just unveiled its latest generation of avatars that can convey human emotion. but where are we going with this? news today of the first global beauty pageant with computer-generated women. "is that a leap forward," asks
8:45 pm
the guardian, or is it a monumental step backwards? we will be right back.
8:46 pm
welcome back. i have said before on this programme that i want us to see ai in action and how it is developing. back with us tonight is samir mallal. he is an award-winning film-maker, ceo and co-founder of one day. and last week, he introduced us to shilo. she's our virtual panelist, who will be a regular face on this programme. i'm calling her a virtual panelist. i have to confess that last week when we spoke to shilo, it was the human intelligence that failed. we had a mute button on the computer. laughter. i am hoping we have ironed that out, and with a bit of luck, she is going to speak to us.
8:47 pm
one day it will unmute itself. exactly, there you go. i want to hear from her, if she's primed to. good evening, shilo, how was your week? come on, shilo. hanging onto the timeless vibes of the 90s, what's up with you? we are going to talk about avatars and human emotion tonight. what do you make of that? guitars? oh yeah, she misheard "avatars", let's try that again. guitars and human emotion sounds like a potent mix, raw and expressive. what vibe are you chasing with this combo? maybe something edgy or wistful. let's try edgy. are you ready to talk about avatars and human emotion, shilo?
8:48 pm
ok, yeah, she's got to pick it up a bit more. are you ready to talk about avatars and human emotion, shilo? digging deep with that opener. so, avatars and human emotions. what particular aspects got you intrigued? like, any examples in mind that are lighting up your thoughts? so the idea is that these new and expressive avatars can blur the line between the virtual world and real characters. one day we will be able to see your face move. how do you feel about that? i want to hear what she says. virtual worlds meshing with reality sounds futuristic yet kind of invasive, don't you think? how do you envision the privacy aspects playing out in this scenario? so, as we said last week, we are now starting to get some sound from her, but she is a picture.
8:49 pm
what we want to talk about is this story that's come out today about the next evolution of avatars. how are they going to train them? so they are going to be trained in all kinds of different ways, but the future is going to be much more personalised: personal training data, making avatars for very particular use cases. in my case, we are working to make them for the creative industries, and that's where i think things are going to go. so she has been trained on what we are talking about, media, ai, where ai is going, and that's her expertise? she's been trained on a lot of my personal background, so my background as a film-maker, as a creative, my ideas about what that is. i think it's really important that we bring our own personal touch and story to these avatars, so there is going to be a whole kind of world of just general knowledge. it's going to be pervasive, and what i find really
8:50 pm
interesting is how do we put our own knowledge into avatars like shilo, and how do we then expand that outwards and kind of combine what is good about us, which is our personal experience, with what she has, which is this vast knowledge. you've been talking to one of the companies that has a load of these avatars, the next-generation avatars driven by the latest chips. what did you find out? what's really driving a lot of innovation here is where the money is. so a lot of people are creating these avatars and they are just trying to make them more human, because they want customer service for enterprise to be driven by avatars, which are cheaper than people. most of the customer service will then be personalised, and for those who want to create that human connectivity to something, there is the influencer market, worth billions of dollars. there are avatars being traded that can essentially appeal to every single niche, every
8:51 pm
demographic. it's cheaper to do that than to have individual humans and pay them to attach themselves to a brand. but with this type of innovation, what you now have are companies that let you create your own. even now you can, with some technical expertise, but in 12 to 18 months you will be able to create your own. you could create christian's best mate. the story only came out today about this new generation, but she went away, found this avatar and did this. hi, christian and priya, i'm an ai avatar by synthesia, and i jump for joy when ai decoded is on on a thursday. when it finishes, i'm rather sad. see you next week. is that why she's not crying? we joke about this, but i'm out of a job. the story there is that what they are now doing is being able to mimic human emotion. so you did see, when she was elated and happy, a smile on her face, and then it sort of tightened
8:52 pm
up. but the point is that you saw her emotion essentially change. so it's about mimicking human emotions. what all of this is doing is simulating, and i think that's really important, because people like to go into completely different areas and say they understand and they feel. no, they don't. they are simulating, and that's really important. looking pretty real. is shilo listening? she's one step ahead. this is the ai decoded avatar, but soon she'll be doing this, right? she is, she is. that's the point of this exercise, to see how it develops. and for me, it's tricky. there will be a stigma attached to it. but you have to go, because you will be reading the news at some point. that's going to happen at some point. i want to go back to our theme at the start of the programme, disinformation, and introduce you to the ways in which ai can be used in crisis management in real time to avert real-life disasters.
8:53 pm
back in 2019, news whip worked with the new transitional government in sudan that replaced president omar al bashir to monitor fake news disseminated by the government's opponents. here's a very good example of how dangerous misinformation can be, and why the technology that tracks it is so important in defeating it. through their online monitoring, valent saw ongoing attempts by malicious actors to stoke tensions between different factions within sudan, and actively monitored them. the intensity of these efforts kept escalating until one day they discovered a facebook post, shared across a network of rogue pages, suggesting that an armed conflict was about to erupt in the capital city. the post was then followed up by a video that claimed to show violent fighting about to break out. the video, actually an old clip from a previous year's conflict in khartoum, was being used to try and manufacture conflict in the streets. the video was gaining traction based on the volume and velocity
8:54 pm
of engagement and interactions. they could see that the post was predicted to go viral in the hours ahead. this was a genuine risk that could trigger the sudanese army and other factions into mobilisation. sadly there was no stopping the civil war breaking out in sudan, but i hope that gives you some idea of how it works. amil khan is the founder and ceo of valent projects, which uses new ai software called ariadne that helped investigations into last year's us bank runs and short-selling by flagging what malign actors are up to and how to respond. welcome to the programme. how do you know that these are bots? how do you identify them? well, we have a list of criteria we are constantly updating, but there are some quite obvious things, such as accounts that post 250 times a day. we have all seen accounts that just have numbers and letters. so we take a view on all of those and give them a rating.
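for context, here is a minimal sketch, in python, of the kind of account-level scoring just described. the 250-posts-a-day figure comes from the interview; the weights, the cut-off and the handle pattern are illustrative assumptions, not valent's actual criteria.

```python
# a rough, illustrative account scorer based on the heuristics described above.
# the weights, the 0.5 cut-off and the handle regex are assumptions for illustration.
import re
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    posts_last_24h: int

def bot_likelihood(account: Account) -> float:
    """return a rough 0-1 score; higher means more bot-like."""
    score = 0.0
    # accounts posting hundreds of times a day are unlikely to be human
    if account.posts_last_24h >= 250:
        score += 0.6
    # handles that are just letters followed by a long run of digits
    # are a common sign of mass-created accounts
    if re.fullmatch(r"[a-z]+\d{6,}", account.handle.lower()):
        score += 0.4
    return min(score, 1.0)

accounts = [Account("jane_smith", 12), Account("kxv83920174", 311)]
flagged = [a.handle for a in accounts if bot_likelihood(a) >= 0.5]
print(flagged)  # ['kxv83920174']
```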
8:55 pm
and here actually i think you can see some of the data that we have on a bank run. it's in the bots there, the purple accounts. what you are seeing there is narrative testing, those are the different narratives that our tool will show you. narratives that get some response are kept; narratives that are not so successful are either tweaked or ditched entirely. now, that second activity there you can see is when those narratives are then deployed. so when you are seeing all these patterns with the bots, can it see patterns in the narratives as well? absolutely. so i think people tend to think that narratives are just one thing that goes out and it's either true or false, but we use a lot of narrative theory in this, and that's more of a sort of psychological issue.
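and here is a similarly rough sketch of the velocity idea from the sudan example: track how fast a post's engagement is growing hour by hour and flag it before it peaks. the hourly sampling and the doubling threshold are assumptions for illustration, not how ariadne actually works.

```python
# flag a post whose engagement is accelerating: if the interactions gained in the
# last hour are at least double those gained the hour before, predict it may go viral.
# the window size and the 2x threshold are illustrative assumptions.
from collections import deque

class ViralityMonitor:
    def __init__(self, window: int = 3, growth_threshold: float = 2.0):
        self.growth_threshold = growth_threshold
        self.samples = deque(maxlen=window)  # cumulative interaction counts, sampled hourly

    def add_hourly_count(self, total_interactions: int) -> bool:
        """record the latest cumulative count; return True if the post looks set to go viral."""
        self.samples.append(total_interactions)
        if len(self.samples) < 3:
            return False
        prev_velocity = self.samples[-2] - self.samples[-3]  # interactions gained two hours ago
        last_velocity = self.samples[-1] - self.samples[-2]  # interactions gained in the last hour
        return prev_velocity > 0 and last_velocity / prev_velocity >= self.growth_threshold

monitor = ViralityMonitor()
for hourly_total in [120, 300, 900]:  # cumulative engagement at each hourly check
    if monitor.add_hourly_count(hourly_total):
        print("predicted to go viral in the hours ahead")
```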
8:56 pm
so, for example, in narrative theory, people start with a problem and go on to a solution. they might say, for example, we all think the buses run late and it's terrible, and people might agree with that. then they will build on that to draw in their audience. sorry to rush you on this, because we have little time, but are you basically doing elon musk's job for him, because he got rid of all the moderators and he is letting all the bots through? i don't think so. we look across a whole lot of platforms and see all sorts of different types of activity, not just on one dominant platform. again, are there particular nation states that keep coming up when you are spotting the bots? nobody will be surprised that we see a lot of chinese activity and russian activity, but they are by no means the only ones. good to have you on the programme, and thank you for keeping that nice and tight for us. we are out of time. i want to thank priya and samir for being with me, and also shilo, who is improving every week. big steps forward. back same time next week with ai decoded, and i hope you join us
8:57 pm
for that. hello there. there were two different types of weather across the country for thursday, neither being warm. across england and wales, there was a lot of cloud around with some patches of rain and these grey skies, the rain, was all tied in with an area of low pressure out in the north sea. you can see the extent of the cloud here. further north, though, we had much brighter weather in scotland with some lengthy spells of sunshine. so if you wanted the sunnier weather, scotland was the place to be. however, we did see some of those brighter skies push in into northern ireland and northern england through the afternoon. now, overnight, we've got a few patches of rain to come and go across southern areas of england, perhaps south wales as well. away from that, most of the uk having clearing skies, especially during the second part of the night, becomes largely dry, and there will be quite a widespread frost in rural areas. so we are looking at a cold and a frosty start to the day for many on friday. but overall, it's a much brighter day with more in the way of sunshine. the exception southern england, where there is the threat of some
8:58 pm
rain, especially in the southwest. and through the afternoon we'll see some showers break out. they'll become quite widespread, but especially across northern and eastern scotland and eastern areas of england. wherever you are, we're looking at another chilly day for the time of year, with temperatures around about 8 to 11 degrees. now, this weekend will see an area of low pressure move up from the south, and this brings with it the threat of some rain. now, on saturday, the rain will be affecting southern england across parts of wales, the midlands and east anglia. there will be a chilly wind gusting to around 25 miles an hour, not desperately strong, but given those low temperatures, the wind, i think, will make it feel that bit colder. for northern ireland, scotland and northern england, after a cold and frosty start, again, we're looking at an afternoon of sunny spells and passing showers, some of the showers having a bit of hail mixed in, and temperatures below average once again. the second part of the weekend, the same area of low pressure threatens some rain across eastern england. now, there is a chance this rain could be a bit more extensive across the midlands and slower to clear, but away from that area,
8:59 pm
again, after a cold and locally frosty start, we're looking at some sunny spells and a number of showers, especially across the north and the west of the country. temperatures continue to run below average for the time of year. however, as we get into next week, if you're fed up with this chilly weather, it does look like we'll see something of a change to much milder weather conditions. however, it's not necessarily dry. there will be some rain and showers around next week.
9:00 pm
hello, i'm christian fraser. you're watching the context on bbc news. this is what you're asking us to say: a president is entitled, for total personal gain, to use the trappings of his office without facing criminal liability. if you don't have immunity, you're not going to do anything. you're going to become a ceremonial president. even before the day began, he kind of held court with workers, and basically his campaign is saying, since they can't be out on the trail, they're going to bring the trail to them here in new york. they have a lot to sort through and muddle through, really, given the lack of case law they have to rely on. tonight on the panel: in washington, the democratic strategist hilary rosen
9:01 pm
and here in london lord peter ricketts, former british ambassador to france who also served
