Created by Bristol artist Alison Larkman, Mirrorbox plays messages from ME and long Covid patients explaining why a particular location is special to them, and why their condition means they cannot be there themselves.

Ms Larkman, who has ME, said the concept came from "the idea of taking up space, of being seen and heard but also being invisible at the same time".
The initiative, titled 'I would be here if I could', has seen the Mirrorbox travel all over the country, and it will be in Bristol and Glastonbury in the coming weeks.

ME causes extreme tiredness and can be so severe that patients are left bed-bound and unable to complete even simple tasks. Other symptoms include problems with memory and concentration, muscle and joint pain, dizziness and sensitivity to light and sound.
When her illness is at its worst, Ms Larkman can only stay awake for three-hour windows.

"Your imagination is huge and you can lay in bed and travel to all sorts of places and think about things whereas you can't do them," she said.
"That's one of the questions I would think - 'where would I be if I could?'"
For Ms Larkman, the answer is always watching the hustle and bustle of London's Victoria Station from the top of the escalators on the way to visit her sister, but for others it is as simple as being able to see their children on the swings at the park.

"We are in a strange position of building these extremely complex things, where we don't have a good theory of exactly how they achieve the remarkable things they are achieving," he says. "So having a better understanding of how they work will enable us to steer them in the direction we want and to ensure that they are safe."
The prevailing view in the tech sector is that LLMs are not currently conscious in the way we experience the world, and probably not in any way at all. But married couple Lenore and Manuel Blum, both emeritus professors at Carnegie Mellon University in Pittsburgh, Pennsylvania, believe that will change, possibly quite soon.

According to the Blums, that could happen as AI systems and LLMs gain more live sensory input from the real world, such as vision and touch, by connecting cameras and haptic (touch-related) sensors to them. The couple are developing a computer model that constructs its own internal language, called Brainish, to process this additional sensory data, in an attempt to replicate the processes that go on in the brain.
"We think Brainish can solve the problem of consciousness as we know it," Lenore tells the BBC. "AI consciousness is inevitable."Manuel chips in enthusiastically with an impish grin, saying that the new systems that he too firmly believes will emerge will be the "next stage in humanity's evolution".