Proud of you, Jadi! I grew up listening to your podcasts, and now, as an AI Scientist, it brings me joy to see that you’re publishing AI-related content in English. Way to go!
Wow, thank you!
Yeahhh Jadi for the president 🤘🏽
Thanks, this was really practical.
Thanks bro. Greetings from Berlin!
No problem 👍
WOW, I like your thick Persian accent, it makes me proud!
Thanks for this interesting video and introducing basics in simple words ❤️
Thankfully Jadi
Very nice! I would love to see how to implement having multiple SLMs chat with each other... perhaps via model switching, so that multiple SLMs aren't kept in memory all the time? It should be fairly fast since the models are small...
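The model-switching idea in this comment can be sketched in plain Python. This is only a toy illustration of the pattern, not a working SLM setup: the `StubSLM` class and all names here are hypothetical stand-ins for a real loader (e.g. llama.cpp or Ollama), and the point is just that at most one "model" is held in memory per turn.

```python
# Hypothetical sketch of the "model switching" idea: two small language
# models take turns chatting, but only one is loaded at a time.
# Real model loading is stubbed out with a toy class; all names here
# are illustrative only.

class StubSLM:
    """Stand-in for a small language model; replace with a real loader."""
    def __init__(self, name: str):
        self.name = name  # pretend weights are loaded here

    def reply(self, message: str) -> str:
        # A real model would generate text; the stub just echoes.
        return f"{self.name} heard: {message!r}"

def chat_round_robin(model_names, opening, turns=4):
    """Alternate between models, keeping only one 'in memory' at a time."""
    history = [opening]
    current = None  # at most one model held at any moment
    for i in range(turns):
        name = model_names[i % len(model_names)]
        current = StubSLM(name)        # "switch": load this model...
        history.append(current.reply(history[-1]))
        current = None                 # ...then drop it so it can be freed
    return history

log = chat_round_robin(["tiny-llama", "phi-mini"], "Hello!")
for line in log:
    print(line)
```

With a real backend such as Ollama, the same switching effect could likely be had by asking the server to unload each model right after its reply, so memory use stays at one small model at a time.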
Thanks 🙏
You’re welcome 😊
Great one, thanks Jadi!
❤❤❤
I love this thanks jadi❤
VERY very GOOD video
As always, your content is top entertainment. Very relaxing, with a lot of really well explained tutorials.
I have a few questions... could you check the website for the LPIC 1 content? It seems the domain was not renewed, or maybe there are other issues, but it is offline.
And do you plan to make LPIC 2 tutorials in English? That would be great, and I am sure a lot of people would appreciate it.
Thanks.
So nice of you to take the time to inform me. I renewed the domain. I have plans for LPIC2, but it won't be too soon... maybe in a few months. LPIC2 covers much more, and it's very difficult to create a good course for it.
This topic still confuses me. Thank you for taking the time to describe it for all of us.
In general it is a confusing area, because even the best of the best do not completely understand what is happening. We understand the NN algorithm, but understanding exactly why the model produces a specific answer... is a mystery!
Thank you, sir.
It's great.
Awesome
Hello Jadi
I am weak in English, but I am trying to watch this video.
You always said it's not dangerous 😂❤
It's ok. You will improve step by step. It's not dangerous :D
❤