
LabInteract

@LabInteract

This is the former Twitter account of a research lab at the University of Sussex. The lab has since moved and this account is no longer active.

Pinned

We are excited to be exhibiting Particle-Based Display #ultrasound #levitation technologies in November 2020 at the #Design Capital of the World Exhibition, in cooperation with the #Shanghai Academy of Fine Arts @ShanghaiEye


LabInteract reposted

Everything ready for showcasing our work in #SoftRobotics #bioinspiration #haptics for bringing people closer and growing together at the #DiscoveryHub - Grow #BloomsburyFestival2023!


Get ready for a thrilling showcase of cutting-edge research by @SoftHaptics and @uclrobotics at this year's #BloomsburyFestival2023! Don't miss out on the future of technology! @uclmecheng @UCLEngineering @UCLEastEngage



LabInteract reposted

The deadline for these positions is August 16th; please share with job seekers in Human-Centered Computing. Thx

Do you want to work on HCI in Copenhagen? We are looking for assistant professors who focus on technologies, including AR, UBICOMP, tangible, IUI, VR, fabrication, and mobile. Deadline August 16th, details at bit.ly/2YfK0OC. Please RT and encourage folks to get in touch 🇩🇰🏛️🎉📚🙏



Using #NeuralNetworks and #GeneticAlgorithms, #Interactlab researchers are exploring how to use #AcousticMetamaterials to #cloak an object from an incoming ultrasound wave. See how the “sound shadow” is reduced by the strategically placed blocks in the second image?

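For readers curious how such an optimisation might be set up, here is a minimal genetic-algorithm sketch, not the lab's actual pipeline: block layouts are encoded as bit strings and evolved to minimise a toy "sound shadow" score. The `shadow_energy` function is a hypothetical stand-in for a real acoustic simulation, and the neural-network surrogate mentioned in the tweet is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
GRID = 8          # candidate block positions form an 8x8 grid in front of the object
POP, GENS = 60, 200

def shadow_energy(layout):
    """Toy stand-in for an acoustic solver: penalise 'shadow' columns that no
    block row covers, plus the number of blocks used. A real fitness call
    would simulate the scattered ultrasound field (e.g. with BEM/FEM)."""
    grid = layout.reshape(GRID, GRID)
    uncovered = np.sum(grid.sum(axis=0) == 0)    # columns with no block at all
    return uncovered + 0.01 * grid.sum()         # prefer sparse layouts

def evolve():
    pop = rng.integers(0, 2, size=(POP, GRID * GRID))
    for _ in range(GENS):
        fit = np.array([shadow_energy(ind) for ind in pop])
        parents = pop[np.argsort(fit)[:POP // 2]]            # truncation selection
        cut = rng.integers(1, GRID * GRID, size=POP // 2)    # single-point crossover
        kids = np.array([np.concatenate([parents[i][:c],
                                         parents[(i + 1) % len(parents)][c:]])
                         for i, c in enumerate(cut)])
        kids ^= (rng.random(kids.shape) < 0.02)              # bit-flip mutation
        pop = np.vstack([parents, kids])
    return pop[np.argmin([shadow_energy(ind) for ind in pop])].reshape(GRID, GRID)

print(evolve())   # best block layout found for the toy objective
```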

With real-time path generation and image projection, our #ParticleBasedDisplays could be used to generate a #hologram of each participant in a conference call, for the next level of #Remote #Interaction

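As an illustration only, not the lab's real-time planner, a greedy nearest-neighbour ordering shows the basic idea of turning image points into a short path for a levitated particle to trace; `greedy_path` and the sampled circle below are hypothetical examples.

```python
import numpy as np

def greedy_path(points):
    """Order 2-D target points with a nearest-neighbour heuristic so a single
    levitated particle can sweep through them with short hops."""
    points = np.asarray(points, dtype=float)
    remaining = list(range(len(points)))
    path = [remaining.pop(0)]
    while remaining:
        dists = np.linalg.norm(points[remaining] - points[path[-1]], axis=1)
        path.append(remaining.pop(int(np.argmin(dists))))
    return points[path]

# e.g. sample a circle outline as the "image" to be traced
theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
targets = np.c_[np.cos(theta), np.sin(theta)]
print(greedy_path(targets)[:5])
```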

From our virtual reading group: #visual distortion is common in museums or events, but what about #auditory distortion? Here, we see how #training users in sound localization can reduce errors when spotting distorted #spatial sound sources. tinyurl.com/yajw52sw


#Schlieren optics can help to visualise invisible air pressure differences, including sound! Here is a neat little project on background-oriented schlieren imaging used on a candle: calebkruse.com/10-projects/sc…

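A rough sketch of the background-oriented schlieren idea, assuming you have a reference photo of a textured background and a second photo taken with the candle in front of it (the filenames are placeholders): dense optical flow gives the apparent pixel displacement caused by the refractive-index gradients.

```python
import cv2
import numpy as np

# Photograph a textured background with and without the candle in front of it,
# then measure how the hot air apparently shifts the background pixels.
ref  = cv2.imread("background_reference.png", cv2.IMREAD_GRAYSCALE)
warm = cv2.imread("background_with_candle.png", cv2.IMREAD_GRAYSCALE)

flow = cv2.calcOpticalFlowFarneback(ref, warm, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2,
                                    flags=0)
displacement = np.linalg.norm(flow, axis=2)     # per-pixel apparent shift

# Brighter pixels = stronger density gradients (the plume above the flame).
cv2.imwrite("schlieren_map.png",
            cv2.normalize(displacement, None, 0, 255,
                          cv2.NORM_MINMAX).astype(np.uint8))
```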

LabInteract reposted

Looking forward to a new exciting chapter of multisensory experience and interface research at @uclic together with @sssram and Diego Martinez Plasencia uclic.ucl.ac.uk/news-events-se…


LabInteract reposted

Really excited to announce my move to @uclic with @obristmarianna and Diego Martinez Plasencia to form one of the largest HCI research groups in the world 😀. uclic.ucl.ac.uk/news-events-se…


In a spooky haunted house attraction, a visitor feels a ghostly hand touch their shoulder. This remote #hapticfeedback is provided by #Interactlab Phased Array Transducer boards.

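A minimal sketch of the underlying principle, not the boards' actual firmware: each transducer is driven with a phase that cancels its propagation delay to the focal point, so the 40 kHz waves arrive in phase and create a pressure spot the skin can feel. The array geometry below is assumed for illustration.

```python
import numpy as np

SPEED_OF_SOUND = 343.0                      # m/s in air
FREQ = 40_000.0                             # typical 40 kHz airborne transducers
K = 2 * np.pi * FREQ / SPEED_OF_SOUND       # wavenumber

def focus_phases(transducer_xyz, focal_point):
    """Return the drive phase (radians) for each transducer so that all emitted
    waves arrive at `focal_point` in phase, creating a pressure focus."""
    d = np.linalg.norm(np.asarray(transducer_xyz) - np.asarray(focal_point), axis=1)
    return (-K * d) % (2 * np.pi)           # advance each element by its travel delay

# e.g. a 16x16 board with 10 mm pitch, focusing 20 cm above its centre
pitch = 0.01
xs, ys = np.meshgrid(np.arange(16) * pitch, np.arange(16) * pitch)
board = np.c_[xs.ravel(), ys.ravel(), np.zeros(256)]
board -= board.mean(axis=0)                 # centre the array at the origin
phases = focus_phases(board, [0.0, 0.0, 0.20])
print(phases[:4])
```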

LabInteract reposted

This coming Tuesday, there’ll be an online @sig_chi Smell, Taste, & Temperature symposium (stt20.plopes.org) with a very exciting line-up! It’s entirely free and open to the public 💐🍦❄️ The symposium is brought to you by the HCIntegration Lab at @uchicagocs @uchicago 1/5


TableHop is an interactive tabletop display made from stretchable spandex and actuated using transparent electrodes. Visuals are projected beneath the fabric, and the electrodes provide #HapticFeedback to the fingertips of the users @acmsigchi #CHI2016 tinyurl.com/ybxubs88


LabInteract reposted

We also strongly encourage proposals that align with SIGCHI's commitment towards addressing racial and other systemic injustices. medium.com/sigchi/a-time-… #BLM


LabInteract reposted

Good read on the purpose and role of Venture Capitalists from GP of Concentric - Kjartan Risk forbes.com/sites/kjartanr…


Check out our newest publication in Advanced Materials Technologies. We use individually tuned unit cells to show focusing of ultrasonic waves at different points in space as well as generation of sonic hologram images. tinyurl.com/y8x9e6qy

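A simplified sketch of how individually tuned unit cells can encode a focusing phase profile, assuming a palette of 16 cells that each add a fixed delay (the geometry and cell count are illustrative, not the values from the paper).

```python
import numpy as np

WAVELENGTH = 0.0086     # ~8.6 mm at 40 kHz in air
K = 2 * np.pi / WAVELENGTH
N_CELLS = 16            # assumed palette of 16 unit cells, each adding a fixed delay

def cell_indices(cell_xy, focal_point):
    """Pick, for each position in a flat metamaterial layer, which of the
    N_CELLS discrete unit cells to place there so the transmitted field
    focuses at `focal_point` (phase quantised to multiples of 2*pi/N_CELLS)."""
    cells = np.asarray(cell_xy, dtype=float)
    xyz = np.c_[cells, np.zeros(len(cells))]          # layer sits in the z = 0 plane
    d = np.linalg.norm(xyz - np.asarray(focal_point), axis=1)
    phase = (-K * d) % (2 * np.pi)
    return np.round(phase / (2 * np.pi / N_CELLS)).astype(int) % N_CELLS

# e.g. an 8x8 layer with one-wavelength pitch, focusing 15 cm above it
xs, ys = np.meshgrid(np.arange(8) * WAVELENGTH, np.arange(8) * WAVELENGTH)
layer = np.c_[xs.ravel(), ys.ravel()]
layer -= layer.mean(axis=0)
print(cell_indices(layer, [0.0, 0.0, 0.15]).reshape(8, 8))
```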

From our virtual reading group: “super-oscillation” is a concept whereby #wave-packets can temporarily oscillate at frequencies much higher than their largest frequency component and thus resolve & manipulate subwavelength areas of space @NatureComms tinyurl.com/y7cao3x8

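A standard textbook construction (not taken from the linked paper) makes the idea concrete:

```latex
% A band-limited function that nevertheless oscillates "too fast" locally
% (the Aharonov--Berry construction), with a > 1 and integer N:
\[
  f(x) = \bigl(\cos x + i\,a\,\sin x\bigr)^{N}
       = \sum_{m=0}^{N} \binom{N}{m}
         \Bigl(\tfrac{1+a}{2}\Bigr)^{\!N-m}
         \Bigl(\tfrac{1-a}{2}\Bigr)^{\!m}\, e^{\,i(N-2m)x},
\]
so every Fourier component satisfies $|k| \le N$, yet for $|x| \ll 1$
\[
  \cos x + i\,a\,\sin x \approx e^{\,iax}
  \quad\Longrightarrow\quad
  f(x) \approx e^{\,iaNx},
\]
i.e.\ near the origin the wave packet locally oscillates at frequency $aN$,
well above its largest frequency component $N$.
```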

Before diving into excellent BEM/FEM acoustic simulation software such as #BEMpp, #Interactlab researchers have been building their own scripts to explore the capabilities of specially designed #acoustic #metamaterials by outputting numerically determined sound fields.

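As an example of what such a quick script might look like (a sketch under simplifying assumptions, not the lab's code): superpose free-field monopole contributions on a grid of evaluation points and inspect the resulting pressure magnitude.

```python
import numpy as np

WAVELENGTH = 0.0086                      # ~8.6 mm at 40 kHz in air
K = 2 * np.pi / WAVELENGTH

def field_from_sources(src_xyz, src_phase, grid_xyz):
    """Complex pressure at each grid point from a set of monopole sources,
    summing e^{i(k r + phase)} / r contributions (Huygens-style superposition).
    A quick numerical sound-field script of this kind is a useful sanity check
    before moving to a full BEM/FEM solver such as BEMpp."""
    p = np.zeros(len(grid_xyz), dtype=complex)
    for s, ph in zip(np.asarray(src_xyz, float), np.asarray(src_phase, float)):
        r = np.linalg.norm(grid_xyz - s, axis=1)
        p += np.exp(1j * (K * r + ph)) / np.maximum(r, 1e-6)
    return p

# evaluate the field on a vertical slice above a small line array of sources
sources = np.c_[np.linspace(-0.04, 0.04, 9), np.zeros(9), np.zeros(9)]
phases = np.zeros(9)                                     # all in phase
xs, zs = np.meshgrid(np.linspace(-0.05, 0.05, 101), np.linspace(0.01, 0.15, 141))
grid = np.c_[xs.ravel(), np.zeros(xs.size), zs.ravel()]
pressure = np.abs(field_from_sources(sources, phases, grid)).reshape(xs.shape)
print(pressure.max(), pressure.shape)
```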

From our #virtual reading group: State-of-the-art #LiDAR technology and a new #algorithm enable #3D reconstructions as far as 320m away in real-time! The algorithm can even #reconstruct targets behind semi-opaque surfaces like a #camouflage net. tiny.cc/o4oaqz


LabInteract reposted

Many thanks to all those interested in our ongoing SMART funding opportunity – we will be able to confirm timings for the next round of applications next week.

