ElasticSearch and Ruby on Rails - Part 1
- Published 30. 05. 2024
- Recently I have been using two databases in my large Rails projects: an RDBMS like MySQL as the primary data store, and ElasticSearch as a denormalized document store that drives the front-end data.
In this video I go over how to get ElasticSearch into your Rails project and get the data into ElasticSearch.
There are tons of uses for ElasticSearch once you have it in your app and taking in data. This video really just scratches the surface. If you want to know more, let me know and I'll make more videos about it!
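As a taste of the wiring covered in the video, here is a minimal sketch of the `as_indexed_json` hook that the `elasticsearch-model` gem calls to build each document. This is a plain-Ruby stand-in so it runs standalone; in the actual app it would be an ActiveRecord model with `include Elasticsearch::Model`, and the field names here are illustrative:

```ruby
require 'json'

# Plain-Ruby stand-in for an ActiveRecord model. With elasticsearch-model,
# the gem calls `as_indexed_json` on each record to get the document body
# that is sent to ElasticSearch.
PurchaseOrder = Struct.new(:id, :customer_name, :total) do
  def as_indexed_json(_options = {})
    { id: id, customer_name: customer_name, total: total }
  end
end

order = PurchaseOrder.new(1, "Acme Corp", 99.5)
doc = order.as_indexed_json
puts doc.to_json
```

Overriding `as_indexed_json` is how you control exactly which fields land in the index, rather than dumping every column of the model.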
Find the source code for the sample app on Github:
github.com/philsmy/elasticsea...
00:00 Introduction
00:32 What Is ElasticSearch and How I use it
03:05 ElasticSearch install info
04:02 Create Rails app - and talk about ES advantages
08:15 Install elasticsearch-rails
10:55 Create Models
14:20 Create Dummy Data with Faker
24:45 Add Elasticsearch to the model
38:30 Denormalizer
42:20 Look at the data in Kibana
References:
ElasticSearch Rails: github.com/elastic/elasticsea...
Elasticsearch: www.elastic.co/
Faker Gem: github.com/faker-ruby/faker
---
Some people online just talk - I just do. One of my major SaaS products out there is Zonmaster.com (public.zonmaster.com) THE AutoResponder and Customer Management tool for Amazon Sellers. Join over 17,000 other Amazon Sellers and sign up today!
Looking for great hosting? I've been using DigitalOcean for 7 years and they have never let me down. Amazing.
Check them out here m.do.co/c/f1c6edf8597f and get $100(!!!) free credits towards the cost of servers.
#techEntrepreneur #rubyOnRails #softwareDevelopment
Thanks for sharing Phil, this is gonna help me a lot in my next project
Glad to help
Very good intro and super useful
Nice practical approach Phil, thanks for sharing.
Thanks for watching!
Great Rails video as always! Please next time increase the font size of your terminal and VSCode
Cool. Resolved :)
Yes, in a couple videos I got all excited and forgot to check beforehand. Now I have a pre-flight checklist! Thanks for watching.
Amazing content Phil, thanks for sharing
Glad you enjoyed it!
Congratulations for the content! I learned a lot from your class.
Greetings from Brazil!
Thank you very much!
Always good to see Brazilians here, that's us ❤
Great stuff, looking forward to more videos
Thanks for watching! And more shall come!
Great video! thanks for sharing
this video is great for learning!
hopefully I can be mentored by you!
Thank you so much!
Excellent video,
New sub, Phil, thanks for sharing this great tutorial
Thanks for the sub!
Loved it!! :)
loved it!!
Thank you!
Great video
Thanks!
Very helpful
Thanks!
I am curious about performance: is there a need to prefetch the relationships inside the `as_indexed_json` method, to avoid N+1 queries?
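A toy illustration of the N+1 concern raised above. These are plain-Ruby stand-ins (`FakeDB` just counts simulated queries); in a real Rails app you would preload with something like `PurchaseOrder.includes(:order_items)` before indexing, so that building `as_indexed_json` for many orders doesn't fire one association query per order:

```ruby
# FakeDB counts simulated database queries so we can compare lazy vs
# preloaded association access.
class FakeDB
  class << self; attr_accessor :queries; end
  @queries = 0
end

OrderItem = Struct.new(:quantity, :cost)

class PurchaseOrder
  def initialize(items)
    @items = items
    @loaded = false
  end

  # Lazy association: first access costs one query, like an unloaded
  # ActiveRecord association.
  def order_items
    unless @loaded
      FakeDB.queries += 1
      @loaded = true
    end
    @items
  end

  # Batch preload: ONE query for all orders, like `includes(:order_items)`.
  def self.preload_items(orders)    # illustrative helper, not a Rails API
    FakeDB.queries += 1
    orders.each { |o| o.instance_variable_set(:@loaded, true) }
  end

  def as_indexed_json(_options = {})
    { order_items_info: order_items.map { |i| { quantity: i.quantity, cost: i.cost } } }
  end
end

lazy = Array.new(5) { PurchaseOrder.new([OrderItem.new(2, 5.0)]) }
FakeDB.queries = 0
lazy.each(&:as_indexed_json)
puts "without preload: #{FakeDB.queries} queries"   # one per order

eager = Array.new(5) { PurchaseOrder.new([OrderItem.new(2, 5.0)]) }
FakeDB.queries = 0
PurchaseOrder.preload_items(eager)
eager.each(&:as_indexed_json)
puts "with preload: #{FakeDB.queries} query"        # one total
```

The same idea applies when bulk importing into ElasticSearch: preload the associations once, then index the batch.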
Thanks a lot!
My pleasure!
Hey Phil..
In the purchase_order_denormalizer you created a fulfilment_fee key which was a combination of quantity * cost of some of the order items. This logic now resides in your denormalizer file, used to generate an order_items_info blob specific to ElasticSearch. In terms of DRYing up this code later, it feels to me like this fulfilment_fee logic would be accessed numerous times within the application and would be better off being at the business logic level of your app. Do you think this would be best housed as a method on the purchase_order model? You could then call purchase_order.fulfillment_amount within the denormalizer file to access it?
Interesting point! I guess it could be a method in either place. In theory all displaying of data comes through the denormalizer...but I see what you are saying. For me, the model is the raw, uncalculated (as much as possible) data, and the denormalized data is the representation that we would use on the site.
If we start going through my code looking to DRY it up we're going to be here all day! :-)
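The commenter's suggestion can be sketched like this, with plain-Ruby stand-ins for the models (names like `fulfillment_fee` and `PurchaseOrderDenormalizer` follow the video but the implementations here are illustrative): the calculation lives on the model, and the denormalizer just reads it when building the blob for ElasticSearch.

```ruby
OrderItem = Struct.new(:quantity, :cost)

class PurchaseOrder
  attr_reader :order_items

  def initialize(order_items)
    @order_items = order_items
  end

  # Business logic on the model, so any part of the app can call it,
  # not just the denormalizer.
  def fulfillment_fee
    order_items.sum { |item| item.quantity * item.cost }
  end
end

# The denormalizer delegates to the model's method instead of
# duplicating the arithmetic.
class PurchaseOrderDenormalizer
  def self.call(order)
    {
      fulfillment_fee: order.fulfillment_fee,
      order_items_info: order.order_items.map { |i| { quantity: i.quantity, cost: i.cost } }
    }
  end
end

order = PurchaseOrder.new([OrderItem.new(2, 3.0), OrderItem.new(1, 4.0)])
puts PurchaseOrderDenormalizer.call(order)[:fulfillment_fee]  # 10.0
```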
Great video! Never seen denormalizers but they seem to do the same thing as serializers... Is there a particular reason for choosing denormalizers over serializers? Is it a lot faster since serializers need to format the data for each request, whereas denormalizers format the data before storing it in ElasticSearch?... Or is that not how it works?
I think it is just wording, but, a `serializer` simply takes a record and writes it out (serializes it!) to something (either a database or a file). In essence when you write a model to the database you are 'serializing' it.
A `denormalizer` takes data from SEVERAL MODELS and combines it all together (denormalizes it) into a single record. Denormalized data is therefore a lot bigger, as it contains 'duplicated' data, and you lose the relational structure. BUT the plus side is that when you query something that has stored denormalized data - if you denormalize it right - you have everything you need - BAM! - in one call.
Hope that helps!
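The distinction can be shown in a few lines of plain Ruby (the model and field names are illustrative): a serializer takes one record and writes out its representation, while a denormalizer pulls from several models into one flat document, duplicating data so a single ElasticSearch query can answer everything.

```ruby
require 'json'

Customer      = Struct.new(:name, :email)
OrderItem     = Struct.new(:sku, :quantity)
PurchaseOrder = Struct.new(:id, :customer, :order_items)

# Serializer: one model in, one representation out.
def serialize(customer)
  { name: customer.name, email: customer.email }
end

# Denormalizer: combines several models into one record, duplicating
# data (the customer name) so one query returns everything.
def denormalize(order)
  {
    id: order.id,
    customer_name: order.customer.name,  # copied from Customer
    items: order.order_items.map { |i| { sku: i.sku, quantity: i.quantity } }
  }
end

customer = Customer.new("Ada", "ada@example.com")
order    = PurchaseOrder.new(42, customer, [OrderItem.new("SKU-1", 2)])

puts serialize(customer).to_json
puts denormalize(order).to_json
```

The trade-off is exactly the one described above: the denormalized document is bigger and duplicates data, but reads need no joins.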
@@PhilSmy thanks for the extensive explanation, it certainly helped 👍🏻
Thanks!
Welcome!
btw what the news tab you have in chrome ? (default tab)
It's a cool thing called Toby. It lets you create bookmark collections. I guess that's what you're talking about! Thanks for watching!
Phil - the question that we all want answered. Did you drink your water? Your reminder was at 10am and it's the afternoon and you didn't hide the reminder. We're worried about you Phil, please drink your water! ;)
Thank you for your concern! Alert goes off once an hour - drink a glass of water an hour!
@@PhilSmy good man! Great content too 🙏