ICP has Superior Culture

  • Published 12 Jun 2024
  • Culture is what defines a country and determines its social standing in the world. Like countries, each blockchain has its own culture, and that culture will determine its fate in the technical landscape of tomorrow. In this episode, I provide an example of a stellar individual in our community. It is individuals like him who will make the Internet Computer the best blockchain network in the world.
    Instagram: JamesAllan0
    Patreon: / cityxcape
  • Science & Technology

Comments • 50

  • @Damien-Infinity
    @Damien-Infinity 21 days ago +12

    ICP is the best ❤❤❤👍💯

  • @bryank1880
    @bryank1880 21 days ago +6

    Thanks for the video James and thanks everyone for the insightful comments!

    • @cxc
      @cxc 21 days ago +1

      You're one of us, Bryan

  • @beachbumLifestyle
    @beachbumLifestyle 21 days ago +13

    I’m proud to announce I’m now an ICP whale, just did it 😅

    • @cxc
      @cxc 21 days ago +1

      That's great news... Congratulations. I'll share your whaleness with the community in the next Q&A.

    • @sjawahar2001
      @sjawahar2001 17 days ago +1

      How many ICP do you hold?

    • @cxc
      @cxc 16 days ago

      @@sjawahar2001 He holds like 12,000

  • @ZeroFuxBE
    @ZeroFuxBE 21 days ago +7

    The Dfinity developer forum is great, I'm sure they would be able to answer such questions.

    • @bryank1880
      @bryank1880 21 days ago +2

      Agreed, I will try my luck there

    • @cxc
      @cxc 21 days ago +1

      Let me know if you find anything good there, Bryan

  • @ln2deep
    @ln2deep 21 days ago +9

    You couldn't currently run an LLM on ICP. You need 28GB of RAM to run a 7-billion-parameter model, which is on the smaller side of the models used in industry. It'll fit on a consumer-grade GPU. Something like ChatGPT needs very specialised hardware that is beyond anyone but major companies. The bigger problem is that you really need to run it on a GPU to make it run efficiently, so that the calculations run in parallel. Moreover, the calculations in the neural network are highly optimised for GPUs and probably wouldn't work well on a CPU. There are CPU-optimised models too, but I struggle to imagine deploying one for use by consumers; it'd take up all of your CPU memory/compute just to run it, and it'd be quite slow. They have talked about adding GPU support to nodes, though. The AI demo they showed was still very impressive because it shows the power of their nodes.
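
A quick back-of-the-envelope check on the 28GB figure: weight memory is roughly parameter count times bytes per parameter. The Python sketch below is illustrative only, assumes nothing about ICP, and ignores activation and KV-cache overhead; it just shows how precision drives the number.

```python
# Rough estimate of the RAM needed just to hold an LLM's weights.
# Ignores activations, KV cache and runtime overhead.

def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Memory (in GB) to store the weights at a given precision."""
    return n_params * bytes_per_param / 1e9

for precision, nbytes in [("fp32", 4), ("fp16", 2), ("int4", 0.5)]:
    print(f"7B model @ {precision}: ~{weight_memory_gb(7e9, nbytes):.0f} GB")

# 7B model @ fp32: ~28 GB   <- the commenter's figure
# 7B model @ fp16: ~14 GB
# 7B model @ int4: ~4 GB    <- why quantised models fit on consumer GPUs
```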

    • @cxc
      @cxc 21 days ago +4

      My goodness! What a sharp answer. Do you mind if I share it with the community? Also, what is your background? The community might want to know that as well.

    • @ln2deep
      @ln2deep 21 days ago +2

      @@cxc Sure, no problem. I'm an AI researcher who works with LLMs. I'm sure I could do more background research on running LLMs on CPUs, but the above is my initial gut response.

    • @JGR1517
      @JGR1517 21 days ago +1

      ​@ln2deep
      Given your synopsis, do you still see a strong future for ICP? I'm no tech expert, but everything I've researched and watched on ICP/Dfinity suggests this is a slam-dunk long-term investment.

    • @ln2deep
      @ln2deep 20 days ago

      ​@@JGR1517 It's definitely the most complex tech out there and it has a great team. Most of the 'work' on AI at the moment in industry is related to legal aspects of AI systems and compliance and the blockchain could help with that in one way or another.
      My main AI play is Render because of its very straightforward value proposition and its extremely impressive collaborations. Having said that, it only tackles the horsepower part of the equation. Render tokens are basically GPU fuel, while ICP tokens are compute cycles. But ICP is also a 'world computer' which the compute cycles run on. It is mind-blowingly more complex.
      My main question with ICP is whether it's really worth moving Web 2 apps onto Web 3. It might not always be worth it, but if ICP corners the Web 3 market then that is huge. I don't see why the majority of websites wouldn't run on ICP. It may well come down to compliance and cost. Blockchain is great for compliance issues, but the question is whether it can be more cost-effective. It's my biggest bag, so I sure hope so.

  • @elchingon3546
    @elchingon3546 21 days ago +3

    ICP ❤❤❤!!!!

  • @Stef_ICP
    @Stef_ICP 21 days ago +5

    ❤ icp

  • @Camillo123
    @Camillo123 21 days ago +3

    Right now it's only possible with cloud infrastructure. But I think it will work in the future.

    • @cxc
      @cxc 21 days ago +1

      I see

  • @DJR000
    @DJR000 21 days ago +2

    Here are some general guidelines:
    - Model size: 1-10 GB (e.g., BERT-base is around 350 MB, while larger models like RoBERTa-large can be up to 5 GB)
    - Memory per node: 16-64 GB (depending on the instance type, e.g., ml.p4d.24xlarge has 64 GB)
    - Total memory: 128-512 GB (or more) for distributed training with multiple nodes
    Keep in mind that these are rough estimates, and the actual memory requirements may vary depending on the specific use case and model configuration.
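
One way to sanity-check ranges like the ones above: during training, each parameter carries not just its weight but also a gradient and optimizer state, so the footprint is a multiple of the raw model size. A minimal sketch, assuming plain FP32 Adam and ignoring activations (which often dominate):

```python
# Why training needs far more memory than the model file alone:
# with Adam, each parameter keeps a weight, a gradient and two moment estimates.
# Approximate figures; activations and batch size are not included.

BYTES_PER_PARAM = 4 + 4 + 4 + 4  # weight + gradient + Adam m + Adam v (all fp32)

def training_state_gb(n_params: float) -> float:
    return n_params * BYTES_PER_PARAM / 1e9

for name, n in [("BERT-base (~110M params)", 110e6), ("1B-parameter model", 1e9)]:
    print(f"{name}: ~{training_state_gb(n):.0f} GB of model state")

# BERT-base (~110M params): ~2 GB of model state
# 1B-parameter model: ~16 GB of model state
```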

  • @dennishanna3022
    @dennishanna3022 21 days ago +4

    Great content Bro!

    • @cxc
      @cxc 21 days ago +1

      Glad you think so!

  • @ArsenedeBienne
    @ArsenedeBienne 21 days ago +4

    great video 💪

    • @cxc
      @cxc 21 days ago +2

      Appreciate it!

  • @farukTheProBro
    @farukTheProBro 20 days ago +1

    thanks for the video😊

    • @cxc
      @cxc 20 days ago +1

      You're always welcome here

  • @ericthorpe4990
    @ericthorpe4990 20 days ago

    Legend 🤙🏻👍🏻🤙🏻

  • @thiagoamaralf
    @thiagoamaralf 21 days ago

    Hi James, really cool situation you encountered. I am not qualified to fully answer the question, but as I recently started studying AI, I can say that collecting and processing data can be very complex. When you are talking about imagery that needs to be analyzed by an AI tool, even a small-scale project's datasets for the LLM and CNNs (Convolutional Neural Networks) can easily reach 1TB. So my recommendation, and what a lot of people generally do, is to develop and train the tool first; once you have the dataset and the tool is starting to work the way you need, you can move the project to another environment. This approach is much easier than speculating without knowing how much it will cost. Personally, I would build this project at home, gather the resources I need, and only once the project has matured enough would I decide where to host it, unless you have a really good budget. Before doing the LLM and CNN dataset work it is very difficult to calculate cost, but if you start in your own environment it will give you a good idea, and then you can speculate more and refine your conclusions. If you intend to train the model on the blockchain, remember that the processing power would cost even more.

    • @thiagoamaralf
      @thiagoamaralf 21 days ago

      However, I also remember watching a video from someone on the Dfinity team mentioning AI. As far as I could understand, the ICP network is not there yet to support big AI datasets, but I might be wrong... cheers!

  • @CarlosRamirez-yk6wv
    @CarlosRamirez-yk6wv 21 days ago +1

    It's not possible to train LLMs yet; it's on the new roadmap. Just inference.

  • @axiomdao
    @axiomdao 20 days ago +1

    I am an anarcho-capitalist and hope that you are correct about ICP being a libertarian culture. I am also building on the IC and will test that theory.

    • @cxc
      @cxc 20 days ago +1

      Nice to see a fellow anarcho-capitalist. I also view myself as a classical liberal, meaning that I believe in the social contract

  • @ScouseWill79
    @ScouseWill79 18 days ago +1

    @cxc I take you back to Dominic Williams' first AI on-chain video. He states that (at the time) they were running 32-bit, which allows for 4GB of storage, and that they were moving to 64-bit, which would allow up to 32GB of storage and lead to LLMs. I'm not sure if that means the 64-bit system would actually allow LLMs, or if he meant in future if they moved to 128-bit.
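
For reference, the 4GB ceiling falls straight out of 32-bit addressing; 64-bit addressing is vastly larger, so a 32GB figure would be a platform limit rather than an address-space limit. A tiny sketch of the arithmetic:

```python
# 32-bit pointers can address 2**32 bytes; 64-bit pointers, 2**64 bytes.

GIB = 2**30  # bytes in a gibibyte

print(f"32-bit address space: {2**32 // GIB} GiB")           # 4 GiB
print(f"64-bit address space: {2**64 // GIB // 2**30} EiB")  # 16 EiB
```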

    • @cxc
      @cxc 17 days ago

      An AI researcher has answered the question, which I'll share with the community in the next town hall, but according to him even the upgrade you talk about would not be sufficient. These LLMs are monsters.

  • @zach8151
    @zach8151 20 days ago

    From my experience with AKT it is possible, and the hardware requirements are dense, but stability seems to be one of the current choke points for running LLMs via Web3, which would put a hospital lower on the list of appropriate early adopters, as system stability is key in healthcare. One of the difficulties with decentralized GPU rental for model training is that the person renting it out to you can decide they need their GPUs back... I've seen customers find workarounds by using models with parallel algos involving other GPUs of similar density... think 4 sets of H100 providers, and if one goes down the algo automatically connects to the second-best choice. It would be cool to see ICP testing SLMs first with less vital services to explore capabilities.
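
The failover pattern described here boils down to keeping a ranked list of providers and re-selecting when the current one disappears. The sketch below is purely hypothetical: the provider names, scores, and availability flag are invented for illustration and do not correspond to any real GPU marketplace API.

```python
# Hypothetical failover across decentralized GPU providers.
# Everything here (names, scores, availability) is made up for illustration.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Provider:
    name: str
    score: float      # e.g. composite of GPU density, price, reliability
    available: bool   # False if the owner reclaimed their hardware

def pick_provider(providers: List[Provider]) -> Optional[Provider]:
    """Return the best-ranked provider that is currently reachable."""
    live = [p for p in providers if p.available]
    return max(live, key=lambda p: p.score, default=None)

providers = [
    Provider("h100-set-a", score=0.95, available=False),  # owner took GPUs back
    Provider("h100-set-b", score=0.90, available=True),
    Provider("a100-set-c", score=0.70, available=True),
]

chosen = pick_provider(providers)
print(chosen.name if chosen else "no provider available")  # -> h100-set-b
```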

  • @arahant69
    @arahant69 21 days ago +2

    The 4GB will double or quadruple soon; I think I heard eight from Dominic at Dfinity.

  • @richdreamzbeats998
    @richdreamzbeats998 21 days ago

    So if nodes are not using GPUs as of now, I wonder whether ICP will move towards Nvidia for their GPUs. If so, Nvidia is absolutely killing it. Also, they just did a 10-for-1 split; Nvidia is going crazy.

  • @beachbumLifestyle
    @beachbumLifestyle 21 days ago +3

    I own more than you now, fingers crossed 😂 that I'm good. Staked, 8-year gang.

  • @sjawahar2001
    @sjawahar2001 17 days ago +1

    How do you become a whale in ICP?

    • @cxc
      @cxc 16 days ago +1

      You gotta own 10,000 or more

  • @MiljanRistanovic
    @MiljanRistanovic 21 days ago +1

    Everything is great, top, and then the price goes down (: Something else is the problem, and nobody knows what. It lost more than 40% of its value in two months. I'm not sure about a happy ending.

  • @ZeroFuxBE
    @ZeroFuxBE 21 days ago +4

    Gang culture

  • @B0N4N24
    @B0N4N24 21 days ago +1

    Nice talk about libertarianism. Which movement within libertarianism do you adhere to?
    I run LLMs locally on my desktop, and this requires a CPU or GPU. GPUs are much faster in this context; without a GPU the results are painfully slow. Dominic's demos have shown that it's currently possible to run LLMs on ICP, but among other things, GPUs still need to be added to the network to run the LLMs fast enough, and I think this is planned (see the sketch below).
    In this context, an interview with the Elna AI team may be interesting: czcams.com/video/LFj3CtqBfXE/video.html. It may be more efficient for him to continue to build on their project.
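
For anyone curious what "running an LLM locally" looks like in practice, here is a minimal sketch assuming the llama-cpp-python package and a quantized GGUF model file already downloaded to disk; the model path is a placeholder. The n_gpu_layers knob is the CPU/GPU trade-off mentioned above: 0 keeps everything on the CPU (slow but works), while a larger value offloads layers to a GPU when one is present.

```python
# Minimal local-inference sketch, assuming llama-cpp-python is installed and a
# quantized GGUF model has been downloaded. The path below is a placeholder.

from llama_cpp import Llama

llm = Llama(
    model_path="./models/7b-chat.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,      # context window
    n_gpu_layers=0,  # 0 = CPU only; raise it (e.g. 35) to offload work to a GPU
)

out = llm("Explain the Internet Computer in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```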

    • @bryank1880
      @bryank1880 21 days ago

      Thanks for the insight

    • @cxc
      @cxc 21 days ago

      Thanks for the lovely answer. I'll be sure to check out the interview you suggested from Blockchain Pill; it seems good.
      Regarding the libertarian question, I'm honestly not sure. I think I'm what people call a classical liberal, meaning I believe in the social contract. What about you?