#glslViewer search results
For those doing #genuary1 there is a GPGPU shader particle system example made with @threejs + @lygia_xyz + glsl-pipeline ported from #glslViewer on my sponsorware repo. Link 👇
Finally! Proper point cloud rendering, through #glslViewer custom engine in #b3d, of the data from my DIY 3D LiDAR! Big plus, live coding GLSL shaders on the side.
Made my own 3D LiDAR sensor and Blender engine based on #glslViewer and #LYGIA to render point clouds, took the gear outside and made some scans… And now you tell me #b3d can’t import normals from PLYs?! Wtf?!
Little victory: apparently the only way for Blender to respect per-vertex normals is to create an entirely new attribute set. Next step is for #glslViewer to check if the attribute exists and load that set
O'clock. Tech used: - #glsl #shaders - #glslViewer - #lygia #creativecoding #generativecoding #art #gradients #blue
More image-to-3D, using MiDAS v3.1, #glslViewer and lygia.xyz
Made my own 3D LiDAR sensor and Blender engine based on #glslViewer and #LYGIA to render point clouds, took the gear outside and made some scans… And now you tell me #b3d can’t import normals from PLYs?! Wtf?!
Love how chromatic aberration makes these 3D reconstructions more oneiric, while hiding the technical limitations. (Made just with #glslViewer, #lygia and MiDAS v3.1)
For the upcoming @brtmoments Buenos Aires Collection, I have been creating new tools for my shader workflow. glsl-sandbox is the first one I open-source. It allows me to quickly iterate on #glslViewer and then port the results to @threejs just by copy-pasting the code. Thread 👇1/2
#shadercraft (№10) buff.ly/49KiLhN by @patriciogv (learning, inspiration) An excellent primer course to help you get the most out of #glslViewer and #lygia. Not only a great explainer for these tools, but a peek into an iconic creator's prototyping process 😍
Integrating #glslViewer as a custom rendering engine in #Blender3d allows super easy camera animations #b3d
For my article on Monocular Depth Estimation, I made my own multi-model pipeline (I call it PRISMA), but I used @rerundotio and #glslViewer ( + @lygia_xyz ) for visualization. x.com/patriciogv/sta…
Little article comparing MiDAS, ZoeDepth, PatchFusion and Marigold models for monocular depth estimation. medium.com/@patriciogv/th…
PRISMA update: new bands for Depth_Anything (relative, not metric) and GmFlow, @Blender and #glslViewer project integration
Excited to share PRISMA, my computational photography pipeline for applying multiple models and inferences on any image or video github.com/patriciogonzal…
I get asked all the time about vertex animation #shaders: #glslViewer running on #blender3D is the best workflow I have ever made/had. You can go back and forth between editing your geometry, UVs & vertex colors, and animating them in #GLSL. Did I tell you it comes with #LYGIA?
LYGIA: on the horizon. Inspired by @marcinignac, jump flooding and 2D GI-related functions will start popping up on @lygia_xyz soon. I have a working version for #glslViewer on my sponsorware repo github.com/sponsors/patri…
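For context, jump flooding approximates a distance field in O(log n) passes, each sampling a 3x3 neighborhood at a halving jump distance. A minimal sketch of one JFA step as a fragment shader; the uniform names (`u_jumpDist`), the seed encoding (normalized coordinates in `u_buffer0.xy`, `vec2(-1.0)` meaning "no seed"), and the pass schedule are all hypothetical assumptions, not the actual LYGIA API:

```glsl
// One jump flooding (JFA) step, sketched for context.
// Assumes seed positions are stored as normalized coordinates in the
// .xy channels of u_buffer0 (vec2(-1.0) = "no seed yet"), and that the
// host sets u_jumpDist to w/2, w/4, ... 1 across successive passes.
#ifdef GL_ES
precision mediump float;
#endif

uniform sampler2D u_buffer0;
uniform vec2 u_resolution;
uniform float u_jumpDist;      // current jump distance in pixels

void main() {
    vec2 st = gl_FragCoord.xy / u_resolution;
    vec2 best = vec2(-1.0);
    float bestDist = 1e20;

    // Sample the 3x3 neighborhood at the current jump distance and
    // keep the closest seed found so far.
    for (int y = -1; y <= 1; y++) {
        for (int x = -1; x <= 1; x++) {
            vec2 offset = vec2(float(x), float(y)) * u_jumpDist / u_resolution;
            vec2 seed = texture2D(u_buffer0, st + offset).xy;
            if (seed.x >= 0.0) {
                float d = distance(st, seed);
                if (d < bestDist) { bestDist = d; best = seed; }
            }
        }
    }
    gl_FragColor = vec4(best, 0.0, 1.0);
}
```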
It's made following the same specs as #glslViewer (github.com/patriciogonzal…) so you just need to copy&paste your code from glslViewer to glsl-sandbox. Yes, I'm that original with names.
Generating the depth offline. The rest is real-time. #glslViewer plays both videos and makes the scene from that
All except the depth estimation. #GlslViewer is playing two videos
Here an example using @Google’s Filament shaders running on #GlslViewer that computes the spherical harmonics and the shadowmap for you instagram.com/p/Boo5yFOgxcA/
In other news, #glslViewer now supports 32-bit depth buffers, which means shadows! (Here running on a @Raspberry_Pi CM3 on the DevTerm by @Hal_clockwork)
One of the design constraints of #glslViewer is strong backward compatibility with embedded systems like the @Raspberry_Pi. Running an OpenGL ES glsl sandbox with no need for an X11/window manager opens the door for a lot of art projects. Photo @admsyn
Polishing an experiment: [ Obj Mesh -> BVH -> SDF -> slices -> PNG ] w github.com/patriciogonzal… -> RayMarching w #GlslViewer + #LYGIA. We shape the tools... : )
If you want to learn how to create shaders I recommend @bookofshaders. You can even create shaders in the browser using #glslEditor: thebookofshaders.com/edit.php?log=2… In case you prefer a desktop version, you can use #glslViewer #unity3d #unitytips #programming #gamedev #indiedev
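A minimal sketch of the kind of fragment shader both tools accept. `u_resolution` and `u_time` are the conventional uniform names that glslEditor and #glslViewer expose; everything else here is just an illustrative animated gradient:

```glsl
// Minimal Book-of-Shaders-style fragment shader.
#ifdef GL_ES
precision mediump float;
#endif

uniform vec2 u_resolution;   // viewport size in pixels
uniform float u_time;        // elapsed seconds

void main() {
    vec2 st = gl_FragCoord.xy / u_resolution;        // normalized coords
    vec3 color = vec3(st.x, st.y, abs(sin(u_time))); // animated gradient
    gl_FragColor = vec4(color, 1.0);
}
```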
#GlslViewer's post-processing pass and buffers are working! Wondering how the integration with #b3d compositing will turn out
. #GlslViewer now generates shadowmaps for your models every time the light position or the model location changes. Here's a dramatic example github.com/patriciogonzal…
Update: adding autocomplete and improved suggestions for commands, returning the lines where errors occur, and a nice progress bar for PNG sequence export to #glslViewer. This will be a serious upgrade to your #glsl #shader coding experience in any IDE. Try it: github.com/patriciogonzal…
#blender images are now uploaded to #glslViewer as textures... need to figure out how to make the uniform name obvious.
Made my own 3D LiDAR sensor and Blender engine based on #glslViewer and #LYGIA to render point clouds, took the gear outside and made some scans… And now you tell me #b3d can’t import normals from PLYs?! Wtf?!
Today in #AudioReactiveVisuals S01E04 we explore “The High Priestess” by @pixelspiritdeck in #glslViewer + #lygia ⏰ 21:00 CEST / GMT+2 🎇 twitch.tv/nerddisco
Using @runwayml's depth estimation + inpainting exports, it's possible to play any video as 3D or volumetric content. Here's an example playing in real time using #glslViewer through a @LKGGlass display
#VERA & #GlslViewer update: support for OpenEXR images as floating point textures github.com/patriciogonzal…
Little victory: apparently the only way for Blender to respect per vertex normals is to create an entire new attribute set. Next step is for #glslViewer to check if the attribute exist and load that set
LYGIA update: Adding sprite and sprite loop animations functions. github.com/patriciogonzal… Tested on: #unity3d #P5 #glslViewer
There it goes, that's the frag shader... I heavily use #includes so most of the code is abstracted. Passes in #glslViewer are declared with BUFFER_<number> defines, and each one writes to a u_buffer<number>.
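The multi-pass convention above can be sketched like this: one .frag file is compiled once per pass, with `BUFFER_<n>` defined for the pass that fills `u_buffer<n>`. The actual pattern drawn in each branch is just a placeholder:

```glsl
// Sketch of glslViewer's multi-pass convention: the same shader is
// compiled per pass, guarded by BUFFER_<n> defines.
#ifdef GL_ES
precision mediump float;
#endif

uniform sampler2D u_buffer0;   // output of the BUFFER_0 pass
uniform vec2 u_resolution;
uniform float u_time;

void main() {
    vec2 st = gl_FragCoord.xy / u_resolution;

#if defined(BUFFER_0)
    // First pass: write something into u_buffer0
    gl_FragColor = vec4(vec3(sin(u_time + st.x * 6.2831)), 1.0);
#else
    // Main pass: read the buffer back and display it
    gl_FragColor = texture2D(u_buffer0, st);
#endif
}
```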