How To Install Fabric - Open-Source AI Framework That Can Automate Your Life
- Published 19 May 2024
- Fabric is described as an open-source framework for augmenting humans using AI. They have community-driven prompts that are tried and true and work well, and an easy command line interface with a UI coming soon.
Join My Newsletter for Regular AI Updates 👇🏼
www.matthewberman.com
Need AI Consulting? ✅
forwardfuture.ai/
My Links 🔗
👉🏻 Subscribe: / @matthew_berman
👉🏻 Twitter: / matthewberman
👉🏻 Discord: / discord
👉🏻 Patreon: / matthewberman
Rent a GPU (MassedCompute) 🚀
bit.ly/matthew-berman-youtube
USE CODE "MatthewBerman" for 50% discount
Media/Sponsorship Inquiries 📈
bit.ly/44TC45V
Links:
github.com/danielmiessler/fabric
Chapters:
0:00 - About Fabric
1:06 - Installation
3:43 - How to Use
Category: Science & Technology
Thank you for keeping me on the cutting edge. I have been trying to cobble something together with CrewAI, but this framework is intriguing and I'm installing it now!
Already has crewAI integration 🎉
This is the best and most practical/useful AI channel of all. Thanks again!
Interesting. I'm not sure what exactly it's doing, but what we really need is premade (tested and working) functions that work with LLMs to process data, such as:
Retrieve information from the internet on topic X, then process, summarize, etc. across many result pages.
Process a PDF/docs: summarize, extract data for model fine-tuning, rewrite, etc.
That would be an interim solution before we get truly capable, immediately accurate models.
Not crazy about them inventing what seems like unnecessary nomenclature.
Thanks again, Matthew. Another excellent vid. Concise, accurate, genuine, no fluff.
What's cool is that one can integrate these prompts into any project.
I'm excited! That's really powerful. Have to check the original sources. Thx Matthew!
Imagine using the patterns in a vector DB (meaningful description and then the output instructions), but instead of using Fabric's prebuilt ones, they let you build your own in their framework. You could ideally train business-ready "templates" to save time and money. I hope they expand on this framework; I'll be interested to see how they can integrate it into different environments and let users customize it further!
As always thank you for keeping us in the loop of new tech out there! Much appreciated!
Yes, you can add your own patterns.
@@barrycrowder Ahhh, awesome, thank you for confirming! Definitely need to set aside some time and get to know this better; it ticks so many good, useful boxes.
@@chookady222 agreed, seems to have very high ROI(t)
This will be great for Ops/IT to build proprietary business-logic patterns for internal workflows, with combined looms, and have that be a single call. Say, for the sales report, you simply have a function called "My Monthly Sales Report" that ties to all the personal patterns you need to pull, and it runs just by typing in that pattern name. It would be cool to subscribe to a pattern and have a dashboard page with whatever boxes of info you want for showing your outputs...
This will be like a Stenotype for pulling data with patterns and such.
So massively complex prompts can be combined, where the actual Serious Business Dashboard is a tiny single page that any non-tech worker can customize for their own use.
@samstave7079 Ideally, like you say, an IT or Ops team who can focus on mapping the workflows, usage patterns, etc. would be crucial to getting the accuracies correct for whatever data you loom in. If you took your concept far enough, you could do away with the non-technical workers and possibly high-level IT workers. Pair this with a deployable application that acts as an agent, layered with an LLM of your choice, a connection to the loom database, code execution methods such as Open Interpreter, and something a bit more versatile to run through browser tasks. You effectively have a pre-trained, defined set of tasks scripted that have been verified and designed around user workflows and business objectives. Build enough looms into the datasets and, as you say, have a single point of access, and you effectively create an agent system to execute human tasks, provided you build a large enough dataset to complete the tasks of individual roles within a company.
Matt, you are a machine. Thank you for what you do.
This is gas! ⛽️Thank you mister Berman!
Honestly pretty convoluted, plus weird terminology like loom, fabric, etc. If one is prompt-challenged, one can just ask the LLM to create a good prompt!
Thank you! Very well done.
Receiving your grade sounded as awesome as having to do extra yard work.
I wonder if these prompts could be written as text on a BBS or SMS? Then, while holding down the Ctrl key, press the letter C; locate an AI interface you want to use it in, and again hold down Ctrl but press P this time. Voilà!
I'd love to see instructions on how to use this with Ollama and/or other open-source offline tools.
Awesome! Thanks for the video. It worked for me.
Hi, and have a nice day.
My name is René and I am trying to realize a project that connects several open-source LLMs with an agent framework. Each LLM is supposed to solve a specific task. Unfortunately, I can't describe it as well as I would like to, but I'll give it a try.
It is about receiving, unpacking, and analyzing a PST file. There should be a separately trained LLM for each area of the analysis. For example, an LLM that graphically displays the connections of the suspect to others, one that is trained on different languages to search in all possible languages, ... so that's the whole thing briefly outlined.
Since I am still relatively new to this area, I would be very happy to receive tips and tricks.
Thank you and best regards from Austria
This might help you:
There is Gorilla AI (a model-to-agent model).
It can be used to create an LLM management process that both searches for & locates the right LLM for a task,
and then also executes as an agent (making API calls [remote website W] to automate the original request).
Yes awesome project! FYI I’ve made a bunch of Pattern demos as Shorts
Pro-tip: I always use venv to create customized Python environments for any AI project. Stable Diffusion front-ends actually use venv by default.
Create a custom Python environment in your home directory (~/venv):
python -m venv ~/venv
Activate it for Mac/Linux users:
. ~/venv/bin/activate
Activate it for Windows users (note the directory is "Scripts", not "bin"):
~/venv/Scripts/Activate.ps1
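A consolidated sketch of the venv tip above (the ~/venv path is just an example; the Windows PowerShell line is shown as a comment because its activation script lives under Scripts rather than bin):

```shell
# Create an isolated Python environment in the home directory
python3 -m venv ~/venv

# Activate it on macOS/Linux
. ~/venv/bin/activate

# On Windows PowerShell the equivalent would be (note Scripts, not bin):
# ~/venv/Scripts/Activate.ps1
```

Once activated, `pip install` only touches that environment, so one project's dependencies can't break another's.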
Can you make a recap video with a summary of the tech you propose and which one you suggest for each category?
I made something like this yesterday: a nice collapsing scroll menu with a bunch of sample questions that you can just click on.
How?
I'm struggling so much to get this running with Ollama locally. I cannot seem to get fabric to check for local models. I'd love a video on this!
ok you'll get a 👍 for that one (for being an actual usage case; haven't tried it out yet though)
Unsolicited shortcut: For selecting all of a transcript, you can use the "End" key, after highlighting the beginning and it will highlight all of the transcript. Use the "Home" key after highlighting the bottom.
Thanks for this video. Very interesting. Would like to seem more use case examples from the available prompts if you have time to put something together.
Agree super cool project. FYI - I have some shorts on different fabric patterns.
Matthew's smirk when he says I am gonna revoke this key before I publish this video is insane.
If you are on Windows 11 using WSL, pbpaste won't work (naturally); you need to install xsel and replace "pbpaste" with "xsel --clipboard --output" in front of your fabric pattern. This does the same thing on a Linux system or on Windows using WSL.
Explain WSL. If you're going to try to be helpful, put the effort in to introduce whatever tools & references you're using.
Using Powershell Get-Clipboard seems to do the job of pbpaste
@RafaGamesPT that's another one too.
`win32yank.exe -o | fabric --pattern summarize` worked for me on my wsl2 installation (with Ubuntu distro)
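Pulling the clipboard threads above together, a hedged per-platform cheat sheet for piping clipboard text into fabric (assumes fabric is installed; xsel and win32yank are optional third-party tools you'd install separately):

```shell
# macOS: pbpaste ships with the OS
pbpaste | fabric --pattern summarize

# Linux or WSL, with xsel installed
xsel --clipboard --output | fabric --pattern summarize

# WSL, calling out to Windows PowerShell for the clipboard
powershell.exe -Command Get-Clipboard | fabric --pattern summarize

# WSL2, with win32yank on the PATH
win32yank.exe -o | fabric --pattern summarize
```

Whatever the platform, the pattern is the same: something prints the clipboard to stdout, and fabric reads it from stdin.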
This kind of AI is going to be killer stuff for everyday use; it needs to be packaged a little more easily, though. Early days yet.
Everything sounds so exciting until you see the installation process and the missing UI afterwards. Then the reaction is just "meh".
if you cannot use a terminal that is on you
*Can you imagine a better way to sharpen your CLI skills?*
Iron sharpens iron, fren 🫡
lmao, same as most 'coders' stuff, clueless about the real public lol
Cool project. Shouldn't be too hard to write a small frontend with a dropdown list of all prompts for this.
Can you show a video on how to feed a .epub book for analysis? Being able to summarize a book like that would be incredibly useful.
Convert epub to PDF and then send it for analysis. Lots of PDF readers
Awesome information
I didn't understand it at all but i LOVE It
can you make a video of this with groq?
Hi, can you please make a video how to setup fabric how to use local ollama LLM:s?
Interesting. I just want an LLM that can load my entire Java Spring Boot project. Even with the @workspace command in GitHub Copilot, it's not able to keep track of context. LLMs are getting a lot better, but I will be amazed when I can walk one through how I want my website to function and it gives me project-wide solutions like "update xyz file with this and that code".
The zsh error happened because of the Poetry setup. Either the script didn't force the newline or the file was manually edited; either way, there wasn't one put in.
I've had some thoughts about whether something like this would exist. I wonder how they eval prompts and if there's any mechanism that might dynamically determine prompt effectiveness.
DSPy is an open-source Python package that is largely made for this purpose. You should check out the docs.
Just seems like a prompt creator, no?
How do we use this tool to continue the conversation?
For example, we might have a directive to respond to a certain type of follow-up prompt, or a directive to respond to the user with follow-up clarification questions, etc. How would we do this from the CLI? Or do we need to use it in a different manner (assuming what I am asking about is possible)?
So I guess what I am asking is whether it is possible to create and rejoin a session.
Wisdom, what does "to extract all the wisdom" mean, like a categorized summary?
Sooo, just to check: I could just lift these prompts and use them any way I like without the nightmares of installing fabric? Perhaps it should just be a free prompt library?
Just FYI, pbcopy/pbpaste are macOS-exclusive, but still super useful.
When i have enough fabric can i make a blanket?
So this is some kind of prompt software that calls the GPT-4 API?
Is there a way to change the GPT model this tool is trying to access? Currently, it is looking for GPT-4 turbo preview, which results in an error. I have an OpenAI account with some money in it.
Could you make a video of this with Llama3 in a local computer please?
Hi Matthew, always thankful for your Content, but could you please first give some Showcase of the solution, so we can know if it is worth spending time for this?
I've been thinking about this a lot lately. Usually I do install and then showcase. Maybe I need to switch the order?
@@matthew_berman If not switching the order, at least give a sneak peek of the possible results, so, like he said, we know whether it's worth spending time on it or just being informed about it. Otherwise, like I did, people will probably just skip to the showcase part.
@@matthew_berman Human brain: Incentive > Motivation > Work. I always have Pavlov's dog in mind 😎
@@matthew_berman Incentive>Motivation>Action.
Are there any open source or close source large action models? I'm not able to find any
Also interested to know if exist
The only company to actually confirm work on a LAM is Rabbit, and theirs is going to be locked to their device, I assume. There's a rumor OpenAI is working on one; GPT-5 could be a LAM... Short answer is no. Still waiting on that.
thank you@@thomasj0330
Regarding the "missing ts" error: remember you can put the error message into an LLM and resolve it.
For me, only GPT-4 or similar LLMs are able to resolve those kinds of things lmao
The majority of Python related AI projects don’t work on my system. I can’t imagine I’m the only one. Even he had issues just getting Poetry to work. Why do Python devs put up with this?
It looks promising but I’ll wait for a UI.
Can Fabric be used without any API keys? Or are there any free models to try it out?
Is this installation for Mac?
Because I'm getting this error:
The term 'git' is not recognized as the name of a cmdlet, function, script file, or operable program.
./setup.sh doesn't work and asks to use pipx install, but I don't know how to do that. Any clue?
Fabric is basically, Metaprompts in the backend.
Elon says they are open sourcing grok this week, exciting.
Hi @MatthewBerman, thanks for the video. It looks like fabric is gonna end up being a "Swiss Army knife" AI tool.
I am having problems setting it up/installing it on Ubuntu on my Windows 11 PC. Everything worked well until I got to running: fabric --setup
I get an error message that says "fabric: command not found". I have tried looking for a solution online but found none.
It would be greatly appreciated if you could spare a few minutes of your busy time to help resolve this.
P.S. If anyone else has the solution to this problem, I welcome their guidance!
Does it work with others or only chatGPT?
What do you do if you use windows with no pbpaste?
next: how to use it to generate images, running all locally
You should consider keeping all important info, like the terminal views, within the title-safe area of the screen. I can't follow your tutorial when you are using the CLI.
Just a few seconds into the video: I think the solution to your screwed-up Python environment, or to not screwing it up in the first place, is to use a virtual Python environment for each project. For me, on Debian 12, that's almost the only way to avoid constantly reading that pip is about to break the system's apt packages.
Can someone tell me which programming language I should learn first in order to do all this AI stuff? I also want to understand what I am doing and not just copy-paste everything. Thanks in advance.
python
Thanks for this. I tried to install this before, but it failed. Still the same issue on Windows when trying to run ./setup.sh: running it from the terminal pops up another terminal very quickly, which then closes. I can't continue the steps after that since nothing is working.
You need to use xsel on Windows WSL; pbpaste is an exclusive Mac command.
Error while cloning the code:
fetch-pack: unexpected disconnect while reading sideband packet
fatal: early EOF
I repeated the same method and it still gives me this error.
Any help?
How can we Linux users do pbpaste? Can you recommend another way to get this functionality in a Linux terminal?
I figured out that I can use xsel to achieve the functionality of pbpaste! Now I am trying to figure out how to use the YouTube API so I can use the extract_wisdom pattern on YT videos.
How can we use Google Gemini API with this?
Sorry guys for interrupting with a comment not related to this video, but I've just seen a weird case when using Gemini Pro 1.0 and GPT-4 in ChatGPT.
The case: I have an unreadable PDF that contains text, but no tool can extract the text inside it, as it is an image of a Word doc converted to a PDF file. I tried screenshotting the file, then gave the screenshots to both Gemini and GPT-4 to OCR, and something weird happened:
1. Gemini refuses to give me the extracted text (the extracted text is all I need). I tried many times with the same result; it refuses, saying it cannot assist with this.
2. GPT-4 gives me wrong OCR. It performed well on the first few pics, but after that, one of two things happens:
1. It gives me the wrong content for the extracted text; some parts seem to be hallucination.
2. It gives totally wrong text. The right text is: "Because of the firm's working capital management pratices are affected by...". Instead, it gave me "The firm's working capital management practices are influenced by...". Somehow, in this case it starts to recognize something and gives me totally different text with the same meaning.
Has anyone seen this before? How can we control the LLM's behavior in this case? It seems pretty weird, because my instruction is to extract the text only; it did very well at the start, but as the conversation grew, somehow the best model lost its attention.
Hopefully some professional in AI and LLMs can explain this. Is it related to losing attention due to the long conversation?
Maybe set the temperature to 0.1 or 0 for the API calls.
Maybe the document contains some kind of hidden text to prevent you from doing what you are trying to do.
Image PDFs are the worst: huge and UNstructured
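A minimal sketch of the temperature suggestion above, calling the OpenAI HTTP API directly (the model name is an assumption; substitute whatever model you actually use, and supply the image per the API's current multimodal input format):

```shell
# Pin temperature to 0 to make the OCR extraction as deterministic as possible
curl -s https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4-turbo",
        "temperature": 0,
        "messages": [
          {"role": "user",
           "content": "Extract the text from this page verbatim. Output only the text."}
        ]
      }'
```

Sending one page per request, rather than a long multi-page conversation, also sidesteps the attention drift described above.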
On Mac, to install fabric, the "./setup.sh" did not work. But "pipx install ." worked.
After installation I keep getting this error:
"Error: Error code: 404 - {'error': {'message': 'The model `gpt-4-turbo-preview` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
Error code: 404 - {'error': {'message': 'The model `gpt-4-turbo-preview` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}"
It's like a collection of Python scripts + prompts for a collection of specific use cases, but so far not very impressive. There are much more interesting things evolving out there, like people making easy user interfaces for LangChain. I would love it if all these people collaborated to improve and accelerate their "life automation" tools together, but we don't live in a perfect world.
How much ram required?
Works on my Mac mini so not an above average amount
Fabric vs OpenDevin vs Devika - What's better?
Please make a video on the BitNet paper from Microsoft! 1 bit llms
I cannot see any automation, these functions can be performed by a custom GPT, did I miss something?
./setup.sh didn't work at all (not recognized as an internal or external command). Typing just 'setup.sh' without the './' brings up a window that opens and closes way too fast for me to see it; aside from that, nothing. pipx install after that does nothing, and typing 'fabric -h' is not recognized.
🎯 Key Takeaways for quick navigation:
00:00 🛠️ *Installing Fabric and Understanding its Use Cases*
- Fabric is an open-source project designed to solve everyday problems using AI.
- Fabric serves as a library of tried and true prompts generated and reviewed by the community.
- Use cases of Fabric include extracting interesting parts of videos, writing essays, summarizing academic papers, creating AI art prompts, and more.
01:12 🔄 *Installing Poetry and Setting up Fabric*
- Install Poetry by running the provided command in the terminal.
- Run the setup script to initialize Fabric.
- Restart the shell or open a new terminal tab to apply changes.
03:13 🛠️ *Setting Up Fabric and Exploring Terminology*
- Configure Fabric by providing the GPT-4 API key using the setup command.
- Explore Fabric's terminology: mill, pattern, stitch, and loom.
- Use the fabric CLI to list available patterns and understand their functionality.
05:04 📚 *Extracting Wisdom from Content*
- Use Fabric's pattern to extract wisdom from content like videos or articles.
- The "extract wisdom" pattern retrieves key ideas, quotes, facts, references, and recommendations.
- Analyze claims and receive an overall score based on the extracted wisdom.
06:37 📊 *Analyzing Claims and Concluding*
- Fabric can automatically analyze claims extracted from content and provide supporting evidence or refutations.
- Each claim is scored, providing insight into its credibility.
- Fabric offers a comprehensive solution for leveraging AI to enhance various tasks.
Made with HARPA AI
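For reference, the installation flow summarized above roughly corresponds to these commands (reconstructed from the steps described in the video; the project's setup may have changed since, so treat this as a sketch):

```shell
# Grab the code
git clone https://github.com/danielmiessler/fabric.git
cd fabric

# Run the bundled setup script (several commenters report
# "pipx install ." as a working fallback)
./setup.sh

# Open a new shell, then configure your API key and explore the patterns
fabric --setup
fabric --list
```

If `fabric: command not found` appears afterwards, the install directory likely isn't on your PATH yet; opening a fresh terminal (as the video notes) is the first thing to try.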
Why are we going in the reverse direction? Such tools increase complexity.
It is not clear from the author how to use it with a local AI like Llama. Did you find out how?
I have had to take a step back from these projects. Too messy that even pinokio won't install/run. Even after attempting to uninstall, deleting files and registry keys
Why can't stuff just be written in C(++) again, with low deps, and (C)Make files...
Do you need an RTX 30- or 40-series card to run it locally?
Not "need" but the better model you can run and the fastest it runs the performance of the whole thing improves exponentially. Better LLM means bigger and then you need more Vram in your graphic card. Faster inference means more cores in your graphic card so more parallel proceses can be run. Everyday appear improvements but i doubt there is something "magic" that allow run big models locally and fast around the corner, not in less than 1 year. Right now the best you can get at consumer level is a nvidia 4090 or maybe 2 connected together. For the 2 cards setup you would need special motherboard, special powersource and special case, that would mean around 5K$ in hardware.
@@rootor1 I can run models up to 13GB with a GTX 1650.
@@rootor1 Search for the channel AISphere and the video about running LLMs offline on a GTX 1650.
czcams.com/video/6_qrS5OAPXo/video.htmlsi=rJodlE3MxCRTHxCe
@@rootor1 AISphere: LLMs running offline on a GTX 1650.
Search and be happy.
Windows holds 72% of the market share. Please do this for Windows:)
Actually to install fabric: you need a needle!
Admit it... Thats a really good point
Cut it out
Why you gotta take a stab at me??
Just poking fun lol
Trying to hold this thread together
Zip it!
Fair enough, you reap what you sow I guess...
Let's just put a pin in this till I can stitch something better together....
Is there a feature to help handicapped people? APPS for PC?
What feature and what kind of handicap?
anyone could integrate it with local ollama?
Fabric is right: over-simplification sums up pretty much your approach to AI.
Why on earth do you need this complex method to optimize prompts?
With an 8k context window you can literally create a custom GPT for this without using knowledge files.
In my view you should ditch conda and live with pyenv+poetry.
Does it support Gemini API??
Yes, it supports all the APIs compatible with the style OpenAI made theirs in (for example Mistral, Groq, or local inference frameworks like llama.cpp); nowadays that API format has become a de facto standard.
Looking forward to an answer to this as I owe OpenAI $124 and they won’t let me use their stuff any more 😢
I don't know that Gemini implements the OpenAI API style closely enough to work. Also, there is no place to set the API URL in the fabric config. 🤔
Don’t say you’ll revoke the key and let them suffer
First. Fix me plz 😂
so its just... a list of prompts?
Classic "it's just..." dumb take minimizing the value of something they don't really understand.
Yes. Would be better to have it as a Website.
@@brunodangelo1146 How is it different? It seems like it just has prompts; I don't get what's so special either.
@@brunodangelo1146 Dumb reply; I code with AI daily.
From the project readme: "Too Many Prompts: the biggest challenge I faced in 2023--which still exists today--is the sheer number of AI prompts out there. We all have prompts that are useful, but it's hard to discover new ones, know if they are good or not, and manage different versions of the ones we like. One of fabric's primary features is helping people collect and integrate prompts, which we call Patterns, into various parts of their lives." There's more on the project philosophy and mission there, but another quick point I'd add is that some of the patterns/prompts, like create_threat_model with its 2,800-word essay on how to make them, are worth checking out.
Nerds unite!
Any opinions ..they add Larry Summer to the board .... ridiculous
it is unclear how you analyzed a video from the command line, considering the prompt is designed to work with text...
What???? You just copy the subtitles; sure, the AI will be able to organize your thoughts since you already said everything in the video. But if you had sent the URL of the video and done what you showed, then for sure there would be no complaints on my part.
shocking
Automate?
Using that is tedious ....
The title was just a trap
First
It has everything I hate in open source: no installer, errors while installing, stupid fantasy naming (loom, stitch, etc.).
What, you don't like the absolutely 180-IQ company name like "Huggingface"? 😂
Did you read the title? Were you expecting something else?? 😂😂😂
Yeah.. OSS is great, open standards and all.. but quirky sometimes.
Stupid fantasy names? It is named Fabric. You make fabric with a loom. You stitch fabric to make things. Where are the fantasy names?
the audacity to hate something you don't even pay for or contribute in any way or shape