#glslviewer search results

Video to 3D Memory (MiDAS v3.1 + #glslViewer + lygia.xyz)


My first 'shader'... 🤘😳 #glsl #glslviewer


For those doing #genuary1 there is a GPGPU shader particle-system example made with @threejs + @lygia_xyz + glsl-pipeline, ported from #glslViewer, on my sponsorware repo. Link 👇
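
The linked example isn't shown here, but the core GPGPU idea is to store particle state in textures and update it in a fragment shader each frame. A minimal sketch of a position-update pass under a ping-pong buffer setup; the uniform names (u_positions, u_delta) are assumed for illustration:

```glsl
// Position-update pass: read last frame's particle positions from a
// texture, integrate a toy velocity field, and write new positions out.
uniform sampler2D u_positions; // previous positions, RGB = xyz (assumed name)
uniform float u_delta;         // frame delta time (assumed name)
varying vec2 v_texcoord;       // one texel per particle

void main() {
    vec3 pos = texture2D(u_positions, v_texcoord).xyz;
    vec3 vel = vec3(-pos.z, 0.0, pos.x);  // illustrative swirl field
    gl_FragColor = vec4(pos + vel * u_delta, 1.0);
}
```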


Finally! Proper point-cloud rendering, through the #glslViewer custom engine in #b3d, of the data from my DIY 3D LiDAR! Big plus: live-coding GLSL shaders on the side.

Made my own 3D LiDAR sensor and Blender engine based on #glslViewer and #LYGIA to render point clouds, took the gear outside and made some scans… And now you tell me #b3d can’t import normals from PLYs?! Wtf?!


Little victory: apparently the only way for Blender to respect per-vertex normals is to create an entirely new attribute set. Next step is for #glslViewer to check if the attribute exists and load that set


Now porting to #glslViewer for a code-centric workflow.


More image to 3D, using MiDAS v3.1, #glslViewer and lygia.xyz

Not bad: real-time camera poses from a single image using MiDAS v3.1 + #glslViewer + #LYGIA
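
These image-to-3D demos typically displace a densely subdivided plane by the estimated depth in a vertex shader. A minimal sketch, using glslViewer-style attribute/uniform names and a hypothetical u_depth texture holding the MiDAS output (the sign and scale of the displacement depend on the model's depth convention):

```glsl
// Vertex shader: offset each vertex of a subdivided plane along Z
// by the depth value sampled from the (assumed) MiDAS depth texture.
uniform mat4 u_modelViewProjectionMatrix;
uniform sampler2D u_depth;     // depth map, 0..1 (hypothetical name)
attribute vec4 a_position;
attribute vec2 a_texcoord;
varying vec2 v_texcoord;

void main() {
    v_texcoord = a_texcoord;
    float d = texture2D(u_depth, a_texcoord).r;
    vec4 pos = a_position;
    pos.z += d;                // displace along Z by estimated depth
    gl_Position = u_modelViewProjectionMatrix * pos;
}
```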



Having some fun adding simple volumetric effects to this classic scene #glslViewer @lygia_xyz
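
The simplest flavor of this kind of volumetric effect is marching the view ray through a density function and accumulating "fog". A self-contained sketch (not the actual scene; u_resolution and u_time are standard glslViewer uniforms, the density field is made up):

```glsl
uniform vec2 u_resolution;
uniform float u_time;

// Toy density field: a drifting sphere of fog (illustrative only)
float density(vec3 p) {
    return clamp(1.0 - length(p - vec3(0.0, 0.0, 3.0 + sin(u_time))), 0.0, 1.0);
}

void main() {
    vec2 uv = (gl_FragCoord.xy * 2.0 - u_resolution) / u_resolution.y;
    vec3 ro = vec3(0.0);                 // ray origin (camera)
    vec3 rd = normalize(vec3(uv, 1.0));  // ray direction
    float fog = 0.0;
    for (int i = 0; i < 64; i++)         // march forward, accumulate density
        fog += density(ro + rd * float(i) * 0.1) * 0.02;
    gl_FragColor = vec4(vec3(fog), 1.0);
}
```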



Love how chromatic aberration makes these 3D reconstructions more oneiric, while hiding the technical limitations. (Made just with #glslViewer, #lygia and MiDAS v3.1)


For the upcoming @brtmoments Buenos Aires Collection, I have been creating new tools for my shader workflow. glsl-sandbox is the first one I open-source. It allows me to quickly iterate in #glslViewer and then port the results to @threejs just by copy-pasting the code. Thread 👇 1/2


#shadercraft (№10) buff.ly/49KiLhN by @patriciogv (learning, inspiration) An excellent primer course to help you get the most out of #glslViewer and #lygia. Not only a great explainer for these tools, but a peek into an iconic creator's prototyping process 😍


Integrating #glslViewer as a custom rendering engine in #Blender3d allows super easy camera animations #b3d


For my article on Monocular Depth Estimation, I made my own multi-model pipeline (I call it PRISMA), but I used @rerundotio and #glslViewer (+ @lygia_xyz) for visualization. x.com/patriciogv/sta…

Little article comparing MiDAS, ZoeDepth, PatchFusion and Marigold models for monocular depth estimation. medium.com/@patriciogv/th…



PRISMA update: new bands for Depth_Anything (relative, not metric) and GmFlow, @Blender and #glslViewer project integration


Excited to share PRISMA, my computational photography pipeline for applying multiple models and inferences on any image or video github.com/patriciogonzal…



I get asked all the time about vertex-animation #shaders. #glslViewer running on #blender3D is the best workflow I have ever made/had. You can go back and forth between editing your geometry, UVs & vertex colors, and animating them in #GLSL. Did I tell you it comes with #LYGIA?
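
A minimal sketch of what such a vertex-animation shader can look like, using glslViewer's standard vertex attributes and uniforms (a_position, a_normal, u_modelViewProjectionMatrix, u_time); the wobble itself is just an illustrative deformation:

```glsl
// Vertex shader: animate the mesh by pushing vertices along their
// normals with a time-varying sine wave.
uniform mat4 u_modelViewProjectionMatrix;
uniform float u_time;
attribute vec4 a_position;
attribute vec3 a_normal;

void main() {
    float wave = sin(a_position.y * 4.0 + u_time * 2.0) * 0.1;
    vec4 pos = a_position + vec4(a_normal * wave, 0.0);
    gl_Position = u_modelViewProjectionMatrix * pos;
}
```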


LYGIA: on the horizon. Inspired by @marcinignac, jump flooding and 2D GI-related functions will start popping up on @lygia_xyz soon. I have a working version for #glslViewer on my sponsorware repo github.com/sponsors/patri…


Welcome to the jumpflood club. Ported 2D GI lighting by @Yaazarai to WebGL and @nodes_io.
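
For context, one jump-flooding pass works like this: every pixel looks at its neighbors at the current jump distance and keeps the nearest seed coordinate seen so far, halving the jump each pass. A minimal sketch (uniform names and the vec2(0.0) "no seed" sentinel are illustrative):

```glsl
// One Jump Flooding Algorithm (JFA) step. Each texel of u_seeds stores,
// in .xy, the 0..1 coordinate of the nearest seed found so far.
uniform sampler2D u_seeds;   // previous JFA pass (assumed name)
uniform vec2 u_resolution;
uniform float u_jump;        // current jump distance in pixels

void main() {
    vec2 st = gl_FragCoord.xy / u_resolution;
    vec2 best = texture2D(u_seeds, st).xy;
    float bestDist = (best == vec2(0.0)) ? 1e6 : distance(st, best);
    for (int y = -1; y <= 1; y++) {
        for (int x = -1; x <= 1; x++) {
            vec2 offset = vec2(float(x), float(y)) * u_jump / u_resolution;
            vec2 seed = texture2D(u_seeds, st + offset).xy;
            if (seed != vec2(0.0) && distance(st, seed) < bestDist) {
                bestDist = distance(st, seed);
                best = seed;
            }
        }
    }
    gl_FragColor = vec4(best, 0.0, 1.0);
}
```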





Generating the depth offline; the rest is real-time. #glslViewer plays both videos and builds the scene from them.


All except the depth estimation. #GlslViewer is playing two videos.





Adding plot,fps and plot,ms commands so you can keep a close eye on performance in #glslViewer


Here's an example using @Google's Filament shaders running on #GlslViewer, which computes the spherical harmonics and the shadow map for you instagram.com/p/Boo5yFOgxcA/


In other news, #glslViewer now supports 32-bit depth buffers, which means shadows! (Here running on a @Raspberry_Pi CM3 on the DevTerm by @Hal_clockwork)


One of the design constraints of #glslViewer is strong backward compatibility with embedded systems like the @Raspberry_Pi. Running an OpenGL ES GLSL sandbox with no need for an X11/window manager opens the door to a lot of art projects. Photo @admsyn


Polishing an experiment: [ Obj Mesh -> BVH -> SDF -> slices -> PNG ] with github.com/patriciogonzal… -> RayMarching with #GlslViewer + #LYGIA. We shape the tools... : )
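
The last arrow of that pipeline (raymarching an SDF baked into a texture of Z-slices) can be sketched like this; the atlas layout, slice count and uniform names are assumptions for illustration, not the actual repo code:

```glsl
uniform vec2 u_resolution;
uniform sampler2D u_sdf;       // baked SDF slices (assumed uniform name)
const float SLICES = 64.0;     // number of Z-slices in the atlas (assumed)

// Sample the 3D SDF from a 2D atlas of Z-slices laid out in a single row.
float sampleSDF(vec3 p) {      // p in 0..1 volume coordinates
    float slice = floor(clamp(p.z, 0.0, 1.0) * (SLICES - 1.0));
    vec2 uv = vec2((slice + clamp(p.x, 0.0, 1.0)) / SLICES, p.y);
    return texture2D(u_sdf, uv).r - 0.5;    // re-center distance around 0
}

void main() {
    vec2 uv = (gl_FragCoord.xy * 2.0 - u_resolution) / u_resolution.y;
    vec3 ro = vec3(0.5, 0.5, -0.5);         // camera in volume space
    vec3 rd = normalize(vec3(uv, 1.0));
    float t = 0.0;
    for (int i = 0; i < 64; i++) {          // classic sphere tracing
        float d = sampleSDF(ro + rd * t);
        if (d < 0.001) break;
        t += d;
    }
    gl_FragColor = vec4(vec3(1.0 - t), 1.0); // crude distance shading
}
```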


If you want to learn how to create shaders I recommend @bookofshaders. You can even create shaders in the browser using #glslEditor: thebookofshaders.com/edit.php?log=2… In case you prefer a desktop version you can use #glslViewer #unity3d #unitytips #programming #gamedev #indiedev


#GlslViewer's post-processing pass and buffers are working! Wondering how the integration with #b3d compositing will go


#GlslViewer now generates shadow maps for your models every time the light or the model changes location. Here's a dramatic example github.com/patriciogonzal…

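
On the consuming side, a fragment shader can project each fragment into light space and compare depths against the generated shadow map. A minimal sketch; u_lightMatrix, u_lightShadowMap and v_position are assumed names here, and the depth bias is illustrative:

```glsl
// Compare the fragment's light-space depth against the shadow map
// to decide whether it is lit (1.0) or shadowed (0.0).
uniform sampler2D u_lightShadowMap;  // light-space depth (assumed name)
uniform mat4 u_lightMatrix;          // world -> light clip space (assumed)
varying vec4 v_position;             // world-space position from the VS

float shadow(vec4 worldPos) {
    vec4 lp = u_lightMatrix * worldPos;
    vec3 coord = lp.xyz / lp.w * 0.5 + 0.5;       // shadow UV + depth, 0..1
    float closest = texture2D(u_lightShadowMap, coord.xy).r;
    return coord.z - 0.005 > closest ? 0.0 : 1.0; // small illustrative bias
}

void main() {
    vec3 base = vec3(0.9);                        // placeholder albedo
    gl_FragColor = vec4(base * mix(0.3, 1.0, shadow(v_position)), 1.0);
}
```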

Update: adding autocomplete and improved suggestions for commands, returning the lines where errors occur, and a nice progress bar for PNG sequence export in #glslViewer. This will be a serious upgrade to your #glsl #shader coding experience in any IDE. Try it: github.com/patriciogonzal…


#blender images are now uploaded to #glslViewer as textures... need to figure out how to make the uniform name obvious.



Today in #AudioReactiveVisuals S01E04 we explore “The High Priestess” by @pixelspiritdeck in #glslViewer + #lygia ⏰ 21:00 CEST / GMT+2 🎇 twitch.tv/nerddisco


Improving the #GlslViewer command console dramatically by bringing in ncurses


Using @runwayml's depth estimation + inpainting exports, it's possible to play any video as 3D or volumetric content. Here's an example playing in real time using #glslViewer through a @LKGGlass display


#VERA & #GlslViewer update: support for OpenEXR images as floating-point textures github.com/patriciogonzal…



LYGIA update: adding sprite and sprite-loop animation functions. github.com/patriciogonzal… Tested on: #unity3d #P5 #glslViewer


There it goes, that's the frag shader... I heavily use #includes so most of the code is abstracted. Passes in #glslViewer are gated with BUFFER_<number> defines and each one writes to a u_buffer<number>.

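Structurally, such a multipass shader compiles once per pass: each BUFFER_<number> define gates one pass, and its result comes back in the next pass as the u_buffer<number> sampler. A minimal two-pass sketch of that layout:

```glsl
uniform sampler2D u_buffer0;   // output of the BUFFER_0 pass
uniform vec2 u_resolution;
uniform float u_time;

void main() {
    vec2 st = gl_FragCoord.xy / u_resolution;

#ifdef BUFFER_0
    // First pass: write an animated pattern into u_buffer0
    gl_FragColor = vec4(vec3(sin(st.x * 20.0 + u_time) * 0.5 + 0.5), 1.0);
#else
    // Final pass: read the buffer back and post-process it
    vec3 color = texture2D(u_buffer0, st).rgb;
    gl_FragColor = vec4(1.0 - color, 1.0);  // toy post-process: invert
#endif
}
```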
