#lossfunctions search results

Fine-tuning with LoRA and #LossFunctions

#LoRA (Low-Rank Adaptation) introduces parameter-efficient learning by injecting low-rank updates into pre-trained models, reducing computational costs without sacrificing performance.

LoRA Training Command:

premai_io's tweet image.
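
The training command itself was in the attached image and isn't recoverable here. As a stand-in, a minimal sketch of LoRA fine-tuning with the Hugging Face peft library; the model choice and hyperparameters are illustrative assumptions, not the tweet's actual setup:

from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load a small pre-trained model to adapt (illustrative choice).
model = AutoModelForCausalLM.from_pretrained("gpt2")

# LoRA injects trainable low-rank matrices into chosen weight matrices;
# the frozen base weights are where the cost savings come from.
config = LoraConfig(
    r=8,                        # rank of the low-rank update
    lora_alpha=16,              # scaling applied to the update
    lora_dropout=0.05,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # only a tiny fraction of weights train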

Day 19: Losses in focus:
MAE, MSE, Huber, Log Loss.
Different goals, different penalties.
Know when to use which.
#MLfromScratch #LossFunctions

dataneuron's tweet image.
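
A minimal NumPy sketch of the four losses named above, with made-up targets and predictions:

import numpy as np

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 1.5, 4.0])

mae = np.mean(np.abs(y_true - y_pred))   # constant gradient, robust to outliers
mse = np.mean((y_true - y_pred) ** 2)    # penalises large errors quadratically

def huber(y, yhat, delta=1.0):
    # quadratic near zero, linear in the tails: a compromise between MSE and MAE
    err = np.abs(y - yhat)
    return np.mean(np.where(err <= delta, 0.5 * err**2, delta * (err - 0.5 * delta)))

# log loss (binary cross-entropy) scores predicted probabilities, not values
p_true = np.array([1, 0, 1])
p_pred = np.array([0.9, 0.2, 0.6])
log_loss = -np.mean(p_true * np.log(p_pred) + (1 - p_true) * np.log(1 - p_pred))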

D7 & 8 - a challenging 4 hours of #100DaysOfCode. Choosing #lossfunctions and optimisers were the topics of today's session. And yep, ran my first logistic #regression on #TensorFlow on the MNIST dataset - my gitbook is starting to look good 😊 #girlswhocode

ilieva_snezhana's tweet image.
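
For anyone following along, a minimal sketch of what such a run could look like in TensorFlow/Keras; the session's actual setup isn't shown, so the hyperparameters here are assumptions:

import tensorflow as tf

# MNIST digits, flattened to 784 features and scaled to [0, 1]
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0

# Logistic regression = one dense layer with softmax over the 10 classes
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,)),
])
model.compile(
    optimizer="sgd",
    loss="sparse_categorical_crossentropy",  # the loss-function choice in question
    metrics=["accuracy"],
)
model.fit(x_train, y_train, epochs=5, batch_size=128)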

Covering #word2vec, #softmax, & #lossfunctions including #crossentropy loss in @dcaragea’s #DeepLearning course, & discussing #autoencoders. (@ Engineering Hall (Durland Phase IV) in Manhattan, KS) swarmapp.com/c/2yIPLppYu8N

banazir's tweet image.
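
A tiny NumPy sketch of the softmax-and-cross-entropy pairing mentioned above (illustrative logits, not course material):

import numpy as np

def softmax(z):
    z = z - z.max()   # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)

# cross-entropy loss against the true class (index 0 here)
loss = -np.log(probs[0])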

If you are interested in getting the granular information of an image, then you have to use slightly more advanced #lossfunctions. We look at a couple of them in this article on image segmentation: bit.ly/3bnTjkp #MachineLearning #DataScience #DeepLearning

neptune_ai's tweet image.
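
One widely used loss of this kind is the Dice loss, which scores region overlap rather than per-pixel accuracy; a minimal NumPy sketch (not necessarily one of the losses the linked article covers):

import numpy as np

def dice_loss(y_true, y_pred, eps=1e-7):
    # y_true, y_pred: flattened per-pixel masks/probabilities in [0, 1]
    intersection = np.sum(y_true * y_pred)
    return 1.0 - (2.0 * intersection + eps) / (np.sum(y_true) + np.sum(y_pred) + eps)

y_true = np.array([1, 1, 0, 0], dtype=float)
y_pred = np.array([0.9, 0.8, 0.1, 0.3])
print(dice_loss(y_true, y_pred))  # small value = good overlap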

Here you can see the most important classification loss functions listed. In the next posts, we'll introduce each of them in more detail. #loss #lossfunctions #verlustfunktionen #klassifikation #classification #informatik #machinelearningtraining #machinelearning

TeachhoodAI's tweet image.

Day 20: Losses in focus:
-> Hinge Loss
-> Categorical Cross-Entropy
Margins vs. probabilities. SVMs vs. Softmax.
Each loss, its own battlefield. Choose your weapon wisely.
#MLfromScratch #LossFunctions
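
A minimal NumPy sketch contrasting the two, using illustrative scores:

import numpy as np

# Multiclass hinge loss (SVM-style): penalise any class whose score
# comes within a margin of 1 of the true class's score
def hinge_loss(scores, true_idx):
    margins = np.maximum(0, scores - scores[true_idx] + 1)
    margins[true_idx] = 0
    return margins.sum()

# Categorical cross-entropy (softmax-style): penalise a low probability
# assigned to the true class
def categorical_ce(scores, true_idx):
    e = np.exp(scores - scores.max())
    return -np.log(e[true_idx] / e.sum())

scores = np.array([2.0, 1.0, -0.5])
print(hinge_loss(scores, 0), categorical_ce(scores, 0))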


Loss functions in AI measure the inconsistency between predicted and actual values. They're a way of evaluating how well a specific algorithm models the given data. #LossFunctions


A well-chosen loss function leads to better learning! #MachineLearning #AI #LossFunctions #DeepLearning #DataScience


Thinking about starting a deep learning project? Don't make the mistake of using the wrong loss function! My latest blog breaks down the different options and helps you choose the best one for your needs. #DeepLearning #LossFunctions #AI #MachineLearning medium.com/@roushanakrahm…


🧠 Loss functions = the compass guiding deep learning! From MSE to cross-entropy, choosing the right one tunes your model for peak performance. Minimize error, maximize impact! 🔗linkedin.com/in/octogenex/r…, instagram.com/ds_with_octoge… #LossFunctions #DeepLearning #AI365 #NeuralNetworks

octogenex's tweet image.

'Localisation of Regularised and Multiview Support Vector Machine Learning', by Aurelian Gheondea, Cankat Tilki. jmlr.org/papers/v25/23-… #lossfunctions #kernels #kernel


🧠 Tomorrow, we'll venture into another critical topic in deep learning. Stay curious! #LossFunctions #NeuralNetworks #DeepLearningFundamentals


Loss functions measure how well a machine learning model fits the data and guide its learning process. Choosing the right loss function is crucial for achieving good performance. Learn more at geeksguide.net/98-2/ #MachineLearning #lossfunctions #DataScience
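
Concretely, the loss guides learning through its gradient: each training step moves the parameters downhill on the loss surface. A minimal sketch with a one-parameter linear model and made-up data:

import numpy as np

# Fit y = w * x by gradient descent on the MSE loss
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
w, lr = 0.0, 0.05
for _ in range(100):
    grad = np.mean(2 * (w * x - y) * x)  # d/dw of mean((w*x - y)^2)
    w -= lr * grad                       # step against the gradient
print(w)  # approaches 2.0, the slope that minimises the loss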


Accuracy v. Iterations .-. #lossfunctions

_smileyball's tweet image.

@karpathy Had an interesting #LossFunctions result today. Thought I'd share.

ChaseAucoin's tweet image.
