The trek to tech central for an annual meet-up of coders isn't all hoodies, snacks and all-night gaming sessions.
Blind software engineer Ed Summers recently shared the main stage with other accessibility experts at the annual GitHub Universe summit in San Francisco.
"When you build, please build without barriers," he urged the elite audience.
"You collectively create the technologies that are used by all of humanity, and if you unknowingly introduce barriers into those technologies then those barriers will undoubtedly create disabilities for some people."
Artificial intelligence is already being used by some people, including coders, who are blind or have low vision, are non-verbal or have difficulty speaking, or who have cerebral palsy and other conditions that affect movement.
Eye-tracking software allows people with vision but no movement to interact with a computer by using their eyes.
For example, Becky Tyler, who was born with quadriplegic cerebral palsy, has helped develop software designed specifically to help people with disabilities play Minecraft with their eyes.
Mr Summers, head of accessibility at code-hosting platform GitHub, said there should be "nothing about us without us".
GitHub is a platform that allows more than 100 million open-source software developers, including 1.4 million coders in Australia, to share and fine-tune their work.
"We do not want people making decisions for us or on our behalf," he told AAP.
"We are capable of being included in the process and contributing ourselves, and technology is not optional."
The World Bank coined the phrase "disability divide" to describe the gap affecting the 1.3 billion people globally who live with disability, including about one in five Australians.
"As a group we experience lower outcomes across many aspects of our lives, including health, education and employment," Mr Summers said.
He said everyone must have access to technology and digital information, and called for diversity in those creating new technology.
"As opposed to 20-something, white male, North American people creating the technology," he explained.
"If we can expand that out to be more global and more inclusive along many dimensions, including disability, then it greatly enhances the chance that everybody can benefit from this human progress," he said.
"That's a big driver for us, and for me personally as well."
Mr Summers said he thinks about accessibility with an upper-case "A" and a lower-case "a".
"Upper-case A" accessibility is about removing barriers for people with disability and ensuring tools such as eye-tracking software, his screen reader or his phone work as well as they can.
"In the industry we have a pretty good understanding of what that is and some standards around how to maximise compatibility with assistive technologies," he said.
"For people who need to customise the user interface and minimum colour thresholds, that's a lot of what accessibility work is, from one perspective."
And then there's "lower-case a" accessibility, which is about making technology more approachable and usable for a greater number of people.
"What's happening right now on the platform and more broadly with generative AI is simply amazing," he said.
"It's going to yield some serious benefits."
The so-called natural language aspect of generative AI, where machines can understand and respond to text and voice data, is transformative, not for everyone but for many people, he said.
Natural language is how most people communicate with each other and express themselves.
Usually, learning how to code involves cryptic syntax and semi-colons that make sense to a machine and a narrow group of humans.
"But if we can express what we want and build things by articulating our vision, that's a game-changer, that's lower-case a accessibility," Mr Summers said.
There is another aspect that he says he experiences personally, and that is anxiety.
"So it's very poignant for me," he said.
He hasn't written code on a daily basis for 15 years, because he's been managing or building and running projects.
"I'm rusty, so when I have to do new things I have anxiety about that and frustration because I can't just rip things out like I used to when I was coding all the time," he explained.
"And I don't want to ask the brilliant engineers because I don't want them to know."
He said he can resolve that anxiety by asking an AI assistant in chat format what to do next, particularly about something he should already know how to do.
"If I had to go ask one of my co-workers I would be very embarrassed."
Darryl Adams, director of accessibility at Intel, said tools such as mobile phone app Be My Eyes provided a feeling of connection that he didn't know he was missing.
"I'm visually impaired and I have a pretty difficult time with many visual tasks, including seeing the details of images on my phone," he said.
"I've been using Be My AI to describe my images for me. The results are remarkable … a series of wow moments over and over, and it just kind of feels like magic."
Be My Eyes founder Hans Jorgen Wiberg said the AI assistant Be My AI, powered by GPT-4, is the latest function of the app and means people are no longer reliant on another person's description.
The function is available for iPhone users and began rolling out to Android phones in December; it can describe a whiteboard, read a menu, or help navigate a street and "see" the outdoors.
"You take a photo and this photo is automatically uploaded to OpenAI and you will get a detailed description," the inventor said.
"It's super important that you actually develop with the people you are developing for," he said.
"The app has definitely improved from the feedback we have gotten directly from our users."
Mr Adams said apps were important for people with disability to connect with the world, and to remain connected, particularly when living with conditions that mean they are "locked in" with no movement or speech.
Intel's ACAT, or Assistive Context-Aware Toolkit, was originally developed over years of collaboration with the late Professor Stephen Hawking, the world-leading physicist and cosmologist who had motor neuron disease.
The open-source software enables communication through keyboard simulation, word prediction and speech synthesis, and allows users to access email, edit documents and use the internet.
"The general idea is to be able to get access to all kinds of computing functions through a single switch," Mr Adams said.
"We can trigger that digital switch by using spatial gestures detected by a camera, or proximity sensors, or even just off-the-shelf mechanical buttons and switches."
He said a new version of ACAT includes a brain-computer interface that can interpret brain signals and communicate basic needs, adding another mode for people experiencing advanced disease progression.
"We also want to give ACAT ears," he said.
"We want the system to be able to listen to the conversation and provide responses or response suggestions in real time."
That would reduce the "silence gap" that usually occurs, he explained.
But tech entrepreneur Joe Devon sounded a note of caution about brain-computer interface technology as it begins to touch on the ability to read thoughts.
"We have to pay attention and draw a line in the sand," he said.
"There should be some regulation around it in order to make sure data is private unless we agree to share it."
Marion Rae
(Australian Associated Press)