The AI Hacker
  • Videos: 6
  • Views: 2,667,624
I Built a Personal Speech Recognition System for my AI Assistant
This video shows you how to build your own real-time speech recognition system with Python and PyTorch. It walks you through the deep learning techniques that work well for modeling speech, along with the code to build your own.
⭐ Play and Experiment With the Latest AI Technologies at grandline.ai ⭐
This video is the second episode of the series "How to build your own A.I. voice assistant with Pytorch"
ruclips.net/p/PL5rWfvZIL-NpFXM9nFr15RmEEh4F4ePZW
Github:
github.com/LearnedVector/A-Hackers-AI-Voice-Assistant
Pre-Trained ASR Model:
drive.google.com/file/d/1jcNOI3jb4GkixA_wuNCIGz-Qjc9OmdxH/view?usp=sharing
Views: 262,974
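The description above covers building a speech recognition system; one common final step in such pipelines is greedy CTC decoding of the network's per-frame outputs. The sketch below is illustrative only, not the code from the linked repo, and the label indices are made up:

```python
# Hypothetical sketch of greedy CTC decoding, a common final step in
# speech recognition pipelines like the one described above. This is
# not the linked repo's code; names and label indices are illustrative.

BLANK = 0  # index reserved for the CTC blank token

def ctc_greedy_decode(frame_label_ids, blank=BLANK):
    """Collapse repeated labels, then drop blanks (the standard CTC rule)."""
    decoded = []
    prev = None
    for label in frame_label_ids:
        if label != prev and label != blank:
            decoded.append(label)
        prev = label
    return decoded

# Per-frame argmax ids spelling "h e l l o" with repeats and blanks:
# h h _ e e _ l l _ _ l _ o  ->  h e l l o
frames = [8, 8, 0, 5, 5, 0, 12, 12, 0, 0, 12, 0, 15]
print(ctc_greedy_decode(frames))  # [8, 5, 12, 12, 15]
```

Note the blank between the two l's: without it, the collapse rule would merge them into a single l.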

Videos

Build your own Deep learning Machine - What you need to know
216K views · 4 years ago
This video talks about what you need to know when sourcing parts to build your own deep learning machine, similar to a Lambda Labs workstation. What type of CPU do you need? What GPU is good for your use case? How much RAM? This video breaks it all down. Outline: 00:01:30 - GPU 00:03:59 - CPU 00:05:38 - RAM 00:06:21 - Motherboard 00:06:55 - Storage 00:07:44 - Power Supply 00:08:19 - Cooling 00:08:...
I Built an A.I. Voice Assistant using PyTorch - part 1, Wake Word Detection
430K views · 4 years ago
This is a series where I walk through the engineering steps and challenges of building an artificial intelligence voice assistant, similar to Google Home or Amazon Alexa, with Python and PyTorch on a Raspberry Pi. I leverage the latest machine and deep learning techniques to achieve this. In this video, I show how you can build a wake word detector (keyword spotting) using recurrent neural ...
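The idea in this episode, a recurrent net scanning audio features frame by frame and emitting a wake-word probability, can be sketched in a few lines. This is a toy stand-in with random weights, not the repo's trained model; the feature sizes are assumptions:

```python
import numpy as np

# Toy sketch of wake word detection as described above: an RNN reads
# MFCC frames and the last hidden state is classified. Weights are
# random stand-ins, not the repo's trained model.

rng = np.random.default_rng(0)
n_mfcc, hidden = 13, 32            # 13 MFCC coefficients per audio frame

Wx = rng.normal(scale=0.1, size=(hidden, n_mfcc))
Wh = rng.normal(scale=0.1, size=(hidden, hidden))
w_out = rng.normal(scale=0.1, size=hidden)

def wake_word_score(mfcc_frames):
    """mfcc_frames: (time, n_mfcc) -> probability the clip is the wake word."""
    h = np.zeros(hidden)
    for frame in mfcc_frames:            # simple Elman-style RNN step
        h = np.tanh(Wx @ frame + Wh @ h)
    logit = w_out @ h                    # classify from the final hidden state
    return 1.0 / (1.0 + np.exp(-logit))  # sigmoid -> probability

clip = rng.normal(size=(50, n_mfcc))     # 50 frames of fake MFCC features
p = wake_word_score(clip)
print(0.0 < p < 1.0)  # True
```

A real detector would use `torch.nn.LSTM` plus a linear head and train on labeled audio clips; the loop above just shows the data flow.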
Illustrated Guide to Transformers Neural Network: A step by step explanation
950K views · 4 years ago
Transformers are all the rage nowadays, but how do they work? This video demystifies the novel neural network architecture with a step-by-step explanation and illustrations of how transformers work. CORRECTIONS: The sine and cosine functions are actually applied to the embedding dimensions and time steps! ⭐ Play and Experiment With the Latest AI Technologies at grandline.ai ⭐ Hugging Face Write with ...
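The correction note above, that sine and cosine are functions of both the time step and the embedding dimension, can be made concrete with the standard sinusoidal positional encoding formula. A minimal numpy sketch (dimensions chosen for illustration):

```python
import numpy as np

# Sketch of the sinusoidal positional encoding the correction above
# refers to: sin/cos depend on both the position (time step) and the
# embedding dimension, per the standard transformer formula.

def positional_encoding(seq_len, d_model):
    pos = np.arange(seq_len)[:, None]               # time steps, column
    i = np.arange(d_model // 2)[None, :]            # embedding-dim pairs, row
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                    # even dims get sine
    pe[:, 1::2] = np.cos(angles)                    # odd dims get cosine
    return pe

pe = positional_encoding(seq_len=10, d_model=8)
print(pe.shape)            # (10, 8): one d_model-sized vector per position
print(pe[0, 0], pe[0, 1])  # 0.0 1.0 -> sin(0) and cos(0) at position 0
```

Each row is added to the token embedding at that position, so every (position, dimension) pair gets a unique phase.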
Illustrated Guide to LSTM's and GRU's: A step by step explanation
493K views · 5 years ago
LSTMs and GRUs are widely used in state-of-the-art deep learning models. For those just getting into machine learning and deep learning, this is a guide in plain English with helpful visuals to help you grok LSTMs and GRUs. Subscribe to receive video updates on practical artificial intelligence and its applications. Also, comment below and let me know what you'd like to see next! ⭐ Play an...
Illustrated Guide to Recurrent Neural Networks: Understanding the Intuition
317K views · 5 years ago
If you enjoy this, check out my other content at www.michaelphi.com Recurrent neural networks are an extremely powerful machine learning technique, but they may be a little hard to grasp at first. For those just getting into machine learning and deep learning, this is a guide in plain English with helpful visuals to help you grok RNNs. Subscribe to receive video updates on practical Artificial ...

Comments

  • @BufferOverflow_RSA
    @BufferOverflow_RSA 22 hours ago

    Goated!!!!

  • @CatchMeIFYouCan-251
    @CatchMeIFYouCan-251 4 days ago

    Man, amazing visualization and great explanation

  • @LynnLe-ky6rf
    @LynnLe-ky6rf 5 days ago

    Beautifully explained! :)

  • @StephenSpivack
    @StephenSpivack 9 days ago

    best video on the topic

  • @Johan-rm6ec
    @Johan-rm6ec 11 days ago

    This video is like a bone without meat.

  • @MattGeo4754
    @MattGeo4754 13 days ago

    Micheal Reeves' good twin.

  • @sherlyw-k5g
    @sherlyw-k5g 20 days ago

    An extremely great video! Thanks a lot! A little question: I wonder whether the Python code for a GRU is similar to an LSTM's or not.
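On the question above: in code the two cells look very similar. A GRU has two gates and a single hidden state; an LSTM has three gates plus a separate cell state. A hand-rolled numpy sketch with random stand-in weights (real code would use `torch.nn.GRU` / `torch.nn.LSTM`):

```python
import numpy as np

# Illustrative GRU cell vs LSTM cell: same shape of code, different gate
# count and state. Weights are random stand-ins, not trained values.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_in, n_h = 4, 8

def gru_cell(x, h, W):                    # W: dict of (n_h, n_in + n_h) mats
    xh = np.concatenate([x, h])
    z = sigmoid(W["z"] @ xh)              # update gate
    r = sigmoid(W["r"] @ xh)              # reset gate
    h_tilde = np.tanh(W["h"] @ np.concatenate([x, r * h]))
    return (1 - z) * h + z * h_tilde      # single hidden state

def lstm_cell(x, h, c, W):
    xh = np.concatenate([x, h])
    f = sigmoid(W["f"] @ xh)              # forget gate
    i = sigmoid(W["i"] @ xh)              # input gate
    o = sigmoid(W["o"] @ xh)              # output gate
    c = f * c + i * np.tanh(W["c"] @ xh)  # separate cell state
    return o * np.tanh(c), c

W = {k: rng.normal(scale=0.1, size=(n_h, n_in + n_h))
     for k in ["z", "r", "h", "f", "i", "o", "c"]}
x, h, c = rng.normal(size=n_in), np.zeros(n_h), np.zeros(n_h)
h_gru = gru_cell(x, h, W)
h_lstm, c_lstm = lstm_cell(x, h, c, W)
print(h_gru.shape, h_lstm.shape, c_lstm.shape)  # (8,) (8,) (8,)
```

The structural difference is visible in the return values: the GRU carries one vector between steps, the LSTM carries two.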

  • @mikheilbeldishevski2776
    @mikheilbeldishevski2776 21 days ago

    Best explanation. Thanks!

  • @quantaozhu8670
    @quantaozhu8670 22 days ago

    Thank you for the excellent explanation. I'm not sure if there's something wrong. At 6:28, I think this is cross product, not dot product, and query's shape is [3,4], key's shape is [4, 3], so the shape of Scores should be [3, 3], not [4, 4].
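On the shapes raised above: attention scores come from a matrix product Q @ Kᵀ (matrix multiplication, not a cross or element-wise product), and with 3 tokens and head dimension 4 the scores matrix is indeed 3×3. A quick check with random values:

```python
import numpy as np

# Shape check for the comment above: scaled dot-product attention scores
# are Q @ K^T. With 3 tokens and head dimension 4, scores are 3x3.
# Values are random; only the shapes matter here.

seq_len, d_k = 3, 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(seq_len, d_k))   # queries: (3, 4)
K = rng.normal(size=(seq_len, d_k))   # keys:    (3, 4)

scores = Q @ K.T / np.sqrt(d_k)       # (3, 4) @ (4, 3) -> (3, 3)
print(scores.shape)  # (3, 3): one score per (query token, key token) pair
```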

  • @arnaudhamida5360
    @arnaudhamida5360 27 days ago

    how can we reproduce your UI for my voice assistant i love it

  • @litttlemooncream5049
    @litttlemooncream5049 1 month ago

    Very kind of you to make these videos! Thanks a lot, but I gave up at 4:06...

  • @RayRay-yt5pe
    @RayRay-yt5pe 1 month ago

    Odd thing is, I actually understood it better using code. Nice one, thanks!

  • @arthurcaleb6853
    @arthurcaleb6853 1 month ago

    Great Video!

  • @object814
    @object814 1 month ago

    Thank you, very informative and clear, one of the most intuitive ones seen on YT, keep up the great job!

  • @grandson_f_phixis9480
    @grandson_f_phixis9480 1 month ago

    Thank you very much

  • @tsclly2377
    @tsclly2377 1 month ago

    Did you burn up your SSDs? I did when ETH mining. SLC is the only way to go, and that means Optane (petabyte writes) as the PCIe-linked NVMe for the data dump (write), as it is as fast as the PCIe 4 bandwidth (4x), and the lanes are really only 8x, as 16x is two 8x lanes.

  • @zix2421
    @zix2421 1 month ago

    Thank you, LSTM is a really useful thing, I'll use it

  • @archlunarwolf
    @archlunarwolf 1 month ago

    He sticks it to Amazon and then goes ahead and orders the SD card from Amazon...

  • @RoonyKingXL
    @RoonyKingXL 1 month ago

    6:30 - Am I being stupid or is the visualization wrong? The scores matrix should be 3x3 not 4x4, right? Please correct me if I'm wrong, I feel like I'm missing something.

  • @Mojo522
    @Mojo522 1 month ago

    Thank you!

  • @prabhdeepsingh8726
    @prabhdeepsingh8726 1 month ago

    This was great. I have seen countless videos on RNNs and LSTMs and nobody explained it by taking a simple example like you did. It was a perfect balance of theory with application.

  • @hamidrezahosseinkhani5980
    @hamidrezahosseinkhani5980 1 month ago

    That was incredible! thanks!

  • @cocoph
    @cocoph 1 month ago

    This is the best explanation of transformer models, please keep going on this channel. There are lots of models that still need explaining!

  • @BrianCarter
    @BrianCarter 2 months ago

    For a neuroatypical, your background music is too distracting, it would be nice if there wasn’t any

  • @Boom-em1os
    @Boom-em1os 2 months ago

    thank you

  • @markthomas2436
    @markthomas2436 2 months ago

    You did a fine job.

  • @phoenix1799
    @phoenix1799 2 months ago

    Bro, I use a setup with 128GB RAM with an RTX 4080 16GB, RTX 3060 OC 12GB, and RTX 2060 Super 8GB, with a 5TB M.2 SSD, but I use an open-air setup for faster cooling. But your cabinet setup looks very efficient and cool. Could you send me the link for it?

  • @tabindahayat3492
    @tabindahayat3492 2 months ago

    Woah! Exquisite, It's a 15 min video but I spent over an hour taking notes and understanding. You have done a great job, keep it up. Thank you so much! Such explanations are rare. ;)

  • @alexdaniel76
    @alexdaniel76 2 months ago

    Cool! 👏 Thank you for the video! What about the OS? Maybe Linux? If Linux, which one?

  • @anamariatiradogonzalez
    @anamariatiradogonzalez 2 months ago

    A tree-of-life structure. Kabakah

  • @anthonybernstein1626
    @anthonybernstein1626 2 months ago

    12:56 isn't that the other way around (i.e. the queries come from the previous layer and the keys and values from the encoder's output)?

  • @guilhermealvessilveira8938
    @guilhermealvessilveira8938 2 months ago

    Excellent

  • @timothyweakly2496
    @timothyweakly2496 2 months ago

    I would like to build one but I'm way ignorant on coding and building.

  • @ashishbhong5901
    @ashishbhong5901 2 months ago

    It was not just helpful but amazing, loved it.

  • @AravindUkrd
    @AravindUkrd 2 months ago

    Please create more videos. You are really good.

  • @Sabumnim666
    @Sabumnim666 2 months ago

    For a guy who did a "lot" of research and wants more cores, why not get a Threadripper?

  • @coolStranger516
    @coolStranger516 2 months ago

    thanks, bro. well explained.

  • @sweatyninja9755
    @sweatyninja9755 2 months ago

    How do i fine tune something?

  • @ayanah4821
    @ayanah4821 2 months ago

    😮

  • @martinsenuy895
    @martinsenuy895 2 months ago

    Hi, super explanatory and easy to follow video! Do you have any updates? Maybe using "cheap" AMDs like 6700xt? haha

  • @walloouu
    @walloouu 2 months ago

    i'm in love <3

  • @aneekeshkumar8199
    @aneekeshkumar8199 2 months ago

    The audio kept bugging me, I'd heard it somewhere, then I remembered the iconic outros of the channel Veritasium!!!!

  • @sahhaf1234
    @sahhaf1234 3 months ago

    in these figures, where are the weights?

  • @josep1429
    @josep1429 3 months ago

    🎯 Key Takeaways for quick navigation: 00:00 *Transformers are revolutionizing natural language processing, outperforming earlier models such as recurrent neural networks.* 02:29 *Transformers introduce an attention-based architecture, allowing potentially unlimited access to context during text generation.* 05:04 *Multi-head attention is a key module in transformers, allowing each word to relate to other words in the input sequence.* 09:24 *The transformer encoder layer uses attention to create a continuous representation of the input information.* 11:01 *During decoding, masking is applied to prevent the model from accessing future tokens, ensuring coherent autoregressive generation.* 14:39 *By overcoming the limitations of short-term memory, transformers are especially effective at encoding and generating long sequences in natural language processing.* Made with HARPA AI

  • @yashgajjar4838
    @yashgajjar4838 3 months ago

    Thank you so much! Very well Explained, cleared most of the doubts.

  • @chriz__3656
    @chriz__3656 3 months ago

    Is it possible to build this on a Raspberry Pi 3? Please reply 😇

  • @hussainbhavnagarwala2596
    @hussainbhavnagarwala2596 3 months ago

    Can we use CNN instead of RNN here for the classification of MFCC images?
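On the question above: yes, MFCCs can be treated as a 2-D "image" of shape (coefficients, time) and classified with a CNN instead of an RNN. A hand-rolled 3×3 valid convolution over a fake MFCC matrix shows the shape arithmetic; the sizes here are assumptions, and a real model would stack `torch.nn.Conv2d` layers:

```python
import numpy as np

# Sketch for the CNN-on-MFCC question above: treat the MFCC matrix as an
# image and convolve over it. One hand-rolled 3x3 valid convolution;
# sizes and values are illustrative, a real model would use torch.nn.Conv2d.

def conv2d_valid(img, kernel):
    """Valid (no-padding) 2-D convolution of img with kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(img[r:r+kh, c:c+kw] * kernel)
    return out

rng = np.random.default_rng(0)
mfcc = rng.normal(size=(13, 50))   # 13 coefficients x 50 time frames
kernel = rng.normal(size=(3, 3))
feat = conv2d_valid(mfcc, kernel)
print(feat.shape)  # (11, 48): the same shape rule as an image convolution
```

One caveat: a plain CNN sees a fixed-size window of time, so variable-length clips need pooling over the time axis or padding to a fixed length, which the RNN approach avoids.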

  • @user-wm8hy8ce2o
    @user-wm8hy8ce2o 3 months ago

    bro you made this video before gpt 3 and all the new era of LLMs !!