#tensorflow search results

Want to build your first #TensorFlow model, but not sure where to start? In this tutorial, you’ll:
→ Load and explore a dataset
→ Build and train your model
→ See what actually improves accuracy
▶️ Watch the full video by @iuliaferoli: youtu.be/nswGrvOhaOY


Built an IMDB sentiment classifier using Embedding + SimpleRNN in TensorFlow 🎬🧠
~98.8% train acc, ~80–85% val acc (observed overfitting).
Next: Dropout, EarlyStopping & LSTM 🚀
Code: github.com/SayliThukral/D…
#DeepLearning #NLP #TensorFlow #Python

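The tweet above describes an Embedding → SimpleRNN classifier that overfits, with Dropout and EarlyStopping as the planned fixes. A minimal Keras sketch of that setup, with both fixes wired in — vocabulary size, sequence length, and embedding width are assumptions, since the tweet doesn't state them:

```python
import tensorflow as tf

# Assumed hyperparameters (not given in the tweet)
VOCAB, MAXLEN, EMB = 10000, 200, 32

model = tf.keras.Sequential([
    tf.keras.Input(shape=(MAXLEN,)),
    tf.keras.layers.Embedding(VOCAB, EMB),      # token ids -> dense vectors
    tf.keras.layers.SimpleRNN(32),              # last hidden state summarizes the review
    tf.keras.layers.Dropout(0.5),               # regularization against the observed overfitting
    tf.keras.layers.Dense(1, activation="sigmoid"),  # positive/negative probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stop training once validation loss stops improving, keeping the best weights:
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=2, restore_best_weights=True)

# Training would then look like (x_train/y_train from the padded IMDB dataset):
# model.fit(x_train, y_train, validation_split=0.2,
#           epochs=20, callbacks=[early_stop])
```

Swapping `SimpleRNN(32)` for `tf.keras.layers.LSTM(32)` is the third change the tweet plans; LSTMs typically hold up better on 200-token reviews because they don't suffer as badly from vanishing gradients.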

Training your first #TensorFlow model isn’t really about the model – it’s about what you learn from it.
A simple example: Train two models on the same data.
Model A: ~88% accuracy
Model B: Slightly better, but ~50% longer training
That’s your first real ML trade-off: is a small

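The accuracy-vs-training-time trade-off above is easy to measure yourself. A minimal sketch, with illustrative architectures and random stand-in data (none of this comes from the tweet): train two models on the same inputs, record both final accuracy and wall-clock time, then judge whether the extra accuracy pays for the extra time.

```python
import time
import numpy as np
import tensorflow as tf

def train_and_time(model, x, y, epochs=3):
    """Train a model and return (final training accuracy, seconds elapsed)."""
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    start = time.perf_counter()
    history = model.fit(x, y, epochs=epochs, verbose=0)
    return history.history["accuracy"][-1], time.perf_counter() - start

# Stand-in data; in practice both models would see the same real dataset.
x = np.random.rand(256, 20).astype("float32")
y = np.random.randint(0, 2, size=256)

model_a = tf.keras.Sequential([          # small and fast
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax")])
model_b = tf.keras.Sequential([          # bigger, slower to train
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax")])

acc_a, t_a = train_and_time(model_a, x, y)
acc_b, t_b = train_and_time(model_b, x, y)
print(f"Model A: acc={acc_a:.2f}, time={t_a:.2f}s")
print(f"Model B: acc={acc_b:.2f}, time={t_b:.2f}s")
```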


Just trained a Neural Network on MNIST using TensorFlow + Keras 🧠
Flatten → Dense(128, ReLU) → Dense(10, Softmax)
✅ Normalized data
✅ Adam optimizer
✅ Model saved & reloaded
Full notebook 👇 github.com/victorjanni/fa…
#MachineLearning #Python #TensorFlow
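The tweet above names the full recipe: Flatten → Dense(128, ReLU) → Dense(10, Softmax), normalized inputs, Adam, save and reload. A sketch of that exact stack — the loss function and epoch count are assumptions, since the tweet doesn't state them:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),                       # 28x28 image -> 784-vector
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"), # one probability per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",  # assumed; typical for MNIST labels
              metrics=["accuracy"])

# Training, with pixel values normalized to [0, 1] as the tweet notes:
# (x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
# model.fit(x_train / 255.0, y_train, epochs=5)

# Save and reload the trained model:
# model.save("mnist_model.keras")
# reloaded = tf.keras.models.load_model("mnist_model.keras")
```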



(Hidden feature) Selfie mode:
the silhouette of whoever appears in the front camera
shows up on the site as a shadow.
Once the shadow film has been playing for a while,
a mysterious object appears — click it to start.
It works on smartphones too,
but we recommend experiencing it on a PC.
#WebGL #Threejs #TensorFlow


We planned and produced the website for WRITING & DESIGN, the company founded by Takuma Takasaki (@takumantakuman) and Kota Yabana. We aimed for a site that directly embodies their sincere, hands-on approach to craftsmanship.
wd-inc.jp
CD, AD, De: Jongho Im @junim
PL, AD, TD: Kenji Okabe @kenjiokabe



Behold Jaxley: a differentiable simulator for biophysical neuron models, written in the Python library #JAX, because we needed something more than #tensorflow. Imagine sweet RNN models with Hodgkin–Huxley-type neurons 🧠
nature.com/articles/s4159…
#neuroAI

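For readers unfamiliar with what "Hodgkin–Huxley-type neurons" means in the tweet above: each neuron is a small system of ODEs coupling membrane voltage to sodium, potassium, and leak currents. A plain-Python forward-Euler sketch with the standard textbook squid-axon parameters — Jaxley itself builds this on JAX for differentiability, so this only illustrates the underlying model, not Jaxley's API:

```python
import math

# Classic Hodgkin-Huxley squid-axon parameters (mS/cm^2, mV, uF/cm^2)
g_na, g_k, g_l = 120.0, 36.0, 0.3
e_na, e_k, e_l = 50.0, -77.0, -54.387
c_m = 1.0

# Voltage-dependent opening/closing rates of the m, h, n gating variables
def a_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def b_m(v): return 4.0 * math.exp(-(v + 65.0) / 18.0)
def a_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def b_h(v): return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
def a_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def b_n(v): return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate(i_ext=10.0, dt=0.01, t_max=50.0):
    """Integrate one HH neuron under constant input; return voltage trace (mV)."""
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # near-resting initial state
    trace = []
    for _ in range(int(t_max / dt)):
        i_ion = (g_na * m**3 * h * (v - e_na)   # fast sodium current
                 + g_k * n**4 * (v - e_k)       # delayed-rectifier potassium
                 + g_l * (v - e_l))             # passive leak
        v += dt * (i_ext - i_ion) / c_m
        m += dt * (a_m(v) * (1 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1 - n) - b_n(v) * n)
        trace.append(v)
    return trace

voltages = simulate()  # with 10 uA/cm^2 drive the neuron fires action potentials
```

A recurrent network of such units, trained end-to-end by differentiating through the solver, is exactly the "RNN with Hodgkin–Huxley-type neurons" the tweet imagines.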
