The Drawback of Client Side Rendering
- added 11. 06. 2020
- Client side rendering is cool until you want to do something that it can't do...
----
Check out my side projects:
If you're into cooking: www.mysaffronapp.com/
----
Join the Discord: / discord
----
Patreon: / benawad
----
Follow Me Online Here:
Twitter: / benawad
Twitch: / benawad
GitHub: github.com/benawad
LinkedIn: / benawad
Instagram: / benawad97
#benawad
TikTok: / benawad
----
Follow me online: voidpet.com/benawad
#benawad - Science & Technology
Your metaphors are next level.
here's your big mac.
Aaaah I see what you did there...
🎈 🏠 🎈
"you are not a karen"
Ben's got 99 problems, but a girlfriend ain't one.
Bruh! He says it with a straight face
@@phantomKE I know right? He just leveled up with these jokes!
🤣🤣
Jokes aside, he's not a bad-looking dude at all. Getting jacked wouldn't hurt tho, as long as he doesn't convert to a life coach and talk about it non-stop like that other coder YouTuber, John Sonmez.
He has a girlfriend in Canada.
solutions:
0) pre-rendering with parcel or webpack
1) server side rendering
your solutions are not client side rendering. he mentioned it.
Fantastic job explaining this! As always, the hilarious dry humor and "next level" metaphors help drive home points and keep things entertaining. Really helped clear up a bunch of stuff and get me pointed in the right direction. Many thanks!
I avoid client-side rendering in order to save CPU cycles for cryptocurrency mining.
hahaha
Hilarious comment! But crypto mining is an inefficient form of revenue on a client's computer; see the TPB case experiment.
@@TechdubberStudios It may pay less than ads, but it's many times better. I support websites that responsibly use cryptomining, and I block ads. Please, don't say that ads are better. They have never been any good to anybody's web browsing experience.
Oh, and you can use cryptomining along with Arc, another earning method that does not involve ads.
I'm done with Google's creepy trackers. Cryptocurrency mining is the future.
@@ezshroom I am genuinely 100% with you on the crypto movement. I hate ads. Always have hated them. But there are at least 2....3 big corporations that come to mind that were built on the ads business model, but with crypto mining... can't find one. And browser-crypto-mining is not exactly a new technology. I really want it to replace ads. I really do. Hate the pop-ups, spying, tracking, that's going on. And the first corpo that comes to mind would be Netflix, when considering whom should adopt the crypto model. Because the users stay on netflix and binge-watch hours and hours!
@@ezshroom also, do you happen to know any website/forum/subreddit focusing on browser-based mining? I would really like to join and dig in more into this subject.
I avoided serverside rendering a meta tag by registering a sub-domain, doing the serverside-rendering there and making my app only compatible with a set number of user-agents. Brilliant!
I like how you explain, well done. Thank you for the quality content.
This channel is slowly becoming one of my favorites on YouTube! 😄
This helped me a lot! I am working on a project and my backend was almost finished. I was using create-react-app with router but switched over to Next.js! Thanks a lot
I've been watching your videos and yes, the quality of the content is always awesome. New subscriber!
Great video. Trying to wrap my head around server side rendering and this video definitely helped
This was the best explanation video I've seen on the matter... Kudos to you Mister...
You are hilarious and informative my dude haha, relatable. And damn dude, the length of your link
You make my day better
This video was hilariously informational, Ben! Thanks! Haha
Love the joke about girlfriend and client side rendering at the beginning
I love the tint on your glasses, it's serial killer-ish, where can i get a pair like those?
a package arrives at your door after the 3rd kill
@@bawad respect
They are the left-behinds after each kill. That's the way you get it.
@@bawad quick scope no scopes?
Those tints are wiped off blood from killing
I had similar issue. good thing you found a better solution.
Dude I was literally searching for the name of this OG tag yesterday. Thanks, dude!
There is a workaround: just add a conditional tag in the small server that builds your page. You can still use client-side rendering except for the meta tags.
Very useful video as always
I think you could easily do this in .NET Core. In the Startup class, in the routing configuration, you could filter each route with the correct meta tags. You could make this an extension and bing bang bosh, neat tidy job done.
this was so easy to understand am subscribed
Woah! My self esteem skyrocketed because I managed to keep up with you until the end :D Aside from that, your content is top notch, keep it coming man.
You're great bro 💖💖
A solution to your problem could be to build a single-page application, with each endpoint for the app being pre-rendered.
It's basically jamstack. Once a user loads one page, the others do not need to be loaded.
Nice content!
Great vid
I love how Ben roasts Angular devs. I thought of that carrot farmer line off and on all day and cracked up every time.
This sounds brilliant and I would need this for my next mvp
Netlify has a free experimental feature called pre-rendering. For me, it works with Facebook; it parses the right meta tags automatically, pictures included. My content comes from a backend via GraphQL and Apollo, meta is set with React Helmet, the page is handled by React Router, and it's a create-react-app project. Hope this helps. You can also do pre-rendering very easily with the react-snap package, but you need to rebuild when data changes. (PS. Thanks for your work, I really like your videos)
Nice GatsbyJS colorway on that shirt 🤙
I think I might try react-snap. That sounds good: pre-rendering on every build, because often the layout of a page is the same even if the content changes. What do I mean by that? Every Reddit post will have the logo, the sidebar, the footer, and a div in the middle which contains the contents of the post. So you can prerender all of that with an empty div and then hydrate it. Even with user-generated content (as long as it is simple and consistent) you could prerender. Thanks for the video.
I had to do that once, I used a Lambda function since it was hosted on AWS, and the function intercepts the CloudFront distribution request and updates the HTML if the request comes from a robot, adding the OpenGraph tags.
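The Lambda@Edge idea above can be sketched as follows. The real CloudFront origin-response event nests the request and response under `event.Records[0].cf`; the event shape here is simplified for illustration, and the tag values are made up:

```javascript
// Sketch of the CloudFront interception described above: on the
// origin-response trigger, rewrite the HTML body with OpenGraph tags
// when the request came from a known crawler, otherwise pass it through.
const CRAWLER_RE = /facebookexternalhit|twitterbot|slackbot/i;

function addOgTags(html, { title, image }) {
  const tags =
    `<meta property="og:title" content="${title}">` +
    `<meta property="og:image" content="${image}">`;
  return html.replace('</head>', tags + '</head>');
}

// Simplified handler: takes the caller's user-agent, the HTML body from
// the origin, and the per-page meta values; returns the body to serve.
function handler(event) {
  const { userAgent, body, meta } = event;
  return CRAWLER_RE.test(userAgent) ? addOgTags(body, meta) : body;
}
```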
Thanks Man!
u watched a 9 min video in 3 ?
@@jinxblaze that is what true supporters do: they appreciate the content even before watching it. it's beautiful.
@@jinxblaze TRIPPLE SPEED
@@ChrisStayte xD
Best explanation, Ever.
Now you can make static pages for your dynamic, frequently updated pages with Next.js. How it works is that it looks at the requested page: if it was built at build time, it sends that back; if it wasn't, it builds it on the fly (at run time) and adds it to the built pages for the next request. Pretty amazing and game-changing!
I use EJS and it allows for variables to be passed before sending the HTML to the client, so that can allow you to change the values in the meta tags.
If the only thing that needs to change is the meta tags (not the rendered bits), you can also modify the html before returning it to the client, inserting the relevant meta tags.
It will probably lead to performance problems, but you could also perform an if condition on the referrer of the request to determine if you should perform such modifications.
For your sake and ours, I hope you DON'T get a girlfriend too soon.
Wow
I don't usually comment on these types of videos
But bro
This is amazing content please keep it up man
Sapper + svelte gives you the best of both worlds
Great video! I use Laravel on the server side to serve up everything: static HTML pages and React apps, or a combo of both. It's easy to embed a React app within a .blade template file. Meanwhile Laravel takes care of everything else, like API services, user registration and authentication, etc. Best of both worlds.
It works! I do exactly that with my react web SPAs. I use firebase and cloud functions to detect user agents and serve SSR version on the fly to robots and CSR version to users. This is also important to SEO indexing, cause some robots won't run any JS and expect html-only responses. Really enjoy your videos.
What about some prerender.io ?
@@leisiyox I thought about using it, but never tried it. Don't know how well it works. It would also cost more than my current firebase cloud function solution.
@@cauebahia what are the conditions that you recommend using firebase?
I thought about using it, but I seek guidance
@@leisiyox I like that they integrate lots of services in a single solution. When you create a firebase project, you instantly have access to file storage, hosting, database, authentication, and some other stuff that makes it really easy. I also like that Firestore has real-time listeners for your data. Really good for a client side rendered app. Also really like their documentation and the fact that you can easily access other Google Cloud services and API. There are many videos online about it. Check it out.
I had this problem once but my focus was towards crawlers. I ended up using some php to "render" the important bits like title, descriptions and links. Then the javascript would remove those elements and do the single page app business. It was back in carrot farmer code days but I'm sure happy coders can accomplish this just as well.
I like your humour😂😂...great lesson as well
Great solutions
English is not my first language, so I only understood it the second time I watched. Thanks for the vid. I haven't built a site with link previews yet; very good to know!
I just used a Node.js Express server to host the compiled create-react-app; this way you can modify the page and add meta tags if needed before serving it. Sort of a mix of server- and client-side rendering, as he said.
Subbed for the consistent Angular claps 💀
If you manage the web server, you could use the web server's router to do the same exact hack you described without the need for a different subdomain, just a route that checks the user-agent of the client and returns different HTML based on it.
Keep us updated, I'm curious if it'll work and what will be the most difficult part. Also, I did not quite understand why you decided not to use react-snap
Very useful, thank you for pointing to react-snap. Happy Hacking Ben 🙌🏻
I was just watching one of your videos on react native animation earlier xD
Keep up the good job 🔥
I had the same issue last week and also was thinking about moving to nextjs, but having a separate domain and server makes a lot more sense.
Well, Next.js or any other SSR solution doesn't mean you're gonna use one server for both the backend and the front-end.
Hey I saw Wes Bos in one of his videos, he used cloud functions to generate the preview and puppeteer i guess to take a screenshot of the url
You are my spirit animal dude.
"It's like I spent a bunch of time building a house and now I want that house to fly." LMAO
I love your humor
Hi Ben, I don't know what you are talking about, but I am addicted to listening to you. Maybe it will start making sense someday; I am still learning React and some other frontend libraries.
2:05 i do it this way:
Server serves response for parsers (meta, og, schema, jsonld and plain html content) and then comes along js that structures it up and takes over routing from this point, so when you navigate you actually don't "refresh"
I guess you never heard of prerender.io
been using it for years
Lol, just killed the whole argument. Never heard of it before. Just goes to show that tech is exponential. Wonder if it will cause the cosmic crash eventually.
yep yep yep , you just commented before me
@@ayushkhanduri2384 same case, I just searched prerender before I add a comment about it to check if it's already mentioned and here it was.
🤯
Your solution at the end is valid. Use reverse proxy to detect the request, and forward them appropriately.
However, it's best to use SSR from the beginning if that's your intention.
Solution: NextJS, Angular Universal, Nuxt, etc.
Also check out the create-exact-app npm (that's exact not react). Like NextJS but Express-forward design, full control at the server side level of what's going on.
@@jpsimons Just FYI, next.js also gives you full server side control. You can just run next as a library within an express server. In my experience, it's super ergonomic while preserving the state-of-the-art benefits of next (code splitting, automatic static optimization, incremental static generation, etc.). Having said that, I have not yet checked out create-exact-app, and am not sure how it differs from nextjs.
Why do I not like the sound of Angular Universal?
@@angshu7589 because you are not carrot farmer. Although color of your profile picture kinda resembles the carrot :D
@Adithya R Svelte Sapper is still in early development. I love Svelte, but Sapper is still far away from production-ready
Gatsby also solves the React single-page problem, since we can generate all the individual HTML, CSS, and JS pages.
I love your humor.
I would use the same client-side bundle BUT add a little bit of logic on the static assets server to add the meta tags to the HTML shell that embeds the client-side bundle. That way, you won't need to implement HTTP redirects, AND it's probably better once you start working with deep links for a mobile app.
Every time I have a problem with my apps I just wait for Ben to have them to so he can solve them for me.
this is actually the first time I understood the difference between client side and server side rendering
Nice video Ben. Try this out and make a video about the results please.
I was wondering what the previews you see on slack or text messages were called. Thanks
Benawad officially a CHAD?!
I will learn WebD so that I can enjoy these digs by Ben😂
Cool stuff about magic links. Even if you hadn't talked about that, you mentioned 🥔. Automatic upvote.
Going through a similar issue myself. My static site is hosted on S3 / CloudFront, orchestrated by terraform. My plan is to use CloudFront Origin Response triggers to trigger a lambda function to add the correct open graph tags to the response. I think this is the lightest weight option.
Did you do it? Am exploring some of this ideas :)
cool dude
You can use the header trick as you mentioned, and then simply use something like puppeteer to load the page on the server itself and then send the rendered HTML page to the client. If the header says it's from a normal user, then don't go to puppeteer, just return your usual index.html file.
I got this idea from here: czcams.com/video/lhZOFUY1weo/video.html
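The header-plus-Puppeteer approach described above could look roughly like this. It assumes the `puppeteer` package is installed; `renderForBot` would only be invoked for crawler user-agents, while everyone else gets the plain index.html:

```javascript
// Rough bot check on the user-agent header (illustrative list)
const BOT_UA = /googlebot|facebookexternalhit|twitterbot/i;

function isCrawler(userAgent) {
  return BOT_UA.test(userAgent || '');
}

// Render the page server-side in headless Chrome and return the final
// DOM as HTML. Slow, so in practice you'd cache the result per URL.
async function renderForBot(url) {
  const puppeteer = require('puppeteer'); // loaded lazily, bots only
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait for JS to settle
  const html = await page.content(); // fully rendered DOM as HTML
  await browser.close();
  return html;
}
```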
I have the exact same T-Shirt!
I think he meant that we should use Next / Nuxt from the beginning. I used to face these problems and since then I use Nuxt for every project and never worry about these problems again
my solution for this is:
1. set a lambda as an index
2. on the index lambda, it should run a headless tool (like a headless chrome) to render the page if the request is a bot, else just serve the raw js
Nice
Of course -- I followed and understood all the way to the end. This is because I'm an unemployed ex-Tech Lead [who has never worked at a FANG company], and a thousandaire.
I just put my meta tags in with variables like %PAGE_NAME% and %PAGE_IMAGE%, and replace them later while serving the page with Express. It doesn't work with client-side routing, but it works for link previews.
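A minimal version of the placeholder trick described above: the built index.html contains marker strings, and the server fills them in per request before responding. Placeholder and field names here are illustrative:

```javascript
// Replace the placeholder markers in the HTML shell with per-page values
function fillMetaPlaceholders(html, values) {
  return html
    .replace(/%PAGE_TITLE%/g, values.title)
    .replace(/%PAGE_IMAGE%/g, values.image);
}

// What the relevant part of a built index.html might look like
const shell =
  '<head><meta property="og:title" content="%PAGE_TITLE%">' +
  '<meta property="og:image" content="%PAGE_IMAGE%"></head>';

// Done once per request, before res.send(page) in Express
const page = fillMetaPlaceholders(shell, {
  title: 'My Post',
  image: 'https://example.com/preview.png',
});
```

As the comment notes, this only covers the initial response; once the client-side router takes over navigation, the tags are never re-filled, which is fine for link previews since crawlers always hit the server fresh.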
Had this issue while using a MeteorJS website running React on the client side. We created a crawler (I think there is an npm project for this) that would go to every page, render it, and save it in our DB. When a non-human (Google) accessed the site, it would be served this rendered HTML, making it SEO-friendly. Basically our server would use the User-Agent to decide what type of content the user would get served. Hope this helps.
Can you please explain what you mean by "render it and save it in the DB"? Do you mean render the DOM elements, attach them to the HTML, store that at some link, and add the link to the DB? Or what exactly are you storing in the DB? If that's the case, wouldn't it be too much for dynamic pages like /rob/photos, /sam/photos and so on to be stored in the DB, or am I missing something?
Facebook's user agent is there for the facebook app browser as well.
Nice solution. Only downside guess would be that Google wants you to show the same content to their bot as the user. But probably doesn't matter if you don't want to index those pages.
This video is 2 years old, and I don't know if it existed when you made it, but today there is a tool called Puppeteer that uses a headless Chrome browser to actually pre-render your HTML, which you can feed to bots while still feeding the client-side rendered version to humans.
Hello Ben, have you thought about the differences between react-snap and Next.js/Gatsby from an SEO perspective? I mean, is there a reason to use Next.js instead of just react-snap to get better results in search engines? Does Next.js/Gatsby do something extra to perform better in SEO? Regards
For some weird cases like mine, where only /particularRoute needs to work like SSR, what I actually tried was hosting a Gatsby project on a particular route of a CRA project, and it worked. I just needed to handle a few re-routing cases.
I had the same problem. I solved it using react-helmet and react-snapshot. It's important not to put the og meta tags in your index.html, only in your react-helmet dynamic page content. If you have og tags in your index.html, Facebook and other platforms will see both sets of og tags and will take the ones in index.html.
Could you use something like Pug to generate an HTML file like the normal one that just contains the meta tag links, and send that instead of the normal blank HTML (instead of having to differentiate by client)?
Well, a simpler but similar way would be to always server-side generate the index.html, but only to change the metadata, while letting the index.js generate the actual content.
This wouldn't hurt human users and is pretty light on the server (only the entry-point HTML meta tags need to be generated depending on the link; the rest of the rendering and navigation is done client-side).
The preview still won't work when users copy paste the link directly from the browser url bar
Hey Ben,
I'm starting a new react web app for something like delivery.com and doordash.com where SEO is a major thing.
Like rich structured snippets of stars etc in google results.
Is SSR the only option?
If your content could be pre-generated (for example, blog), one of the options would be to use Gatsby.js
yeah
In that case Gatsby or NextJS or any other you recommend?
I don't know if Gatsby will do but their web.dev score hits 💯 so was thinking that but in a dilemma
Next.js
@@bawad do you know an easy way to migrate from create-react-app to Next.js? I've found problems with Redux, SVG images, and React Router's active route (NavLink). Great video btw ;)))
I liked the idea however, I think it is still good to have SSR for all users or maybe SEO as well.
Would you share your ideas on micro-frontends, too? Are you preparing some sort of tutorial on that, or what?
Hey Ben. Can you make a video with a list of all the terminology a web developer should know? Client-side rendering etc.?
Just make a simple API where the client can send urls and it will return a preview. Just make sure to cache the pages and add rate limits so they can’t spam your api.
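The preview API suggested above boils down to extracting the OpenGraph tags from a fetched page, plus a cache. A sketch with an in-memory cache (fetching is left abstract; the regex is a rough illustration, a real service would use an HTML parser):

```javascript
// In-memory cache: url -> parsed tags (a real API would add TTL + rate limits)
const cache = new Map();

// Pull og:* meta tags out of an HTML string into a plain object
function parseOgTags(html) {
  const tags = {};
  const re = /<meta\s+property="og:(\w+)"\s+content="([^"]*)"/g;
  let m;
  while ((m = re.exec(html)) !== null) {
    tags[m[1]] = m[2];
  }
  return tags;
}

// fetchHtml is injected (e.g. an http client); repeated calls for the
// same URL hit the cache instead of refetching the page
function getPreview(url, fetchHtml) {
  if (!cache.has(url)) {
    cache.set(url, parseOgTags(fetchHtml(url)));
  }
  return cache.get(url);
}
```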
the girlfriend problem might be solved if you stop walking around wearing asexual flag shirts
hahaha
lmao, good catch, respect
He's just playing hard to get. Karen gets it.
But with this if he ever gets one, she will be the right one. Lol
He do check a lot of aesthetic boxes from the virgin meme... Though I probably do too 😆
3:17 thanks for explaining this here lol, I was like why wouldn't you be able to just stick the meta tags in the requested HTML
Sounds similar to generating a PDF report (e.g. an invoice) from rendered content. My solution was to open the link in a headless browser (like Puppeteer) on the server and save the rendered result, then perhaps implement some kind of caching. It is simple but slow.
7:20 Would it be possible to use the same URL but check the header value on, for example, the nginx server?
Like: is this user agent a bot (Twitter, FB, etc.)? => proxy to your slim API for only the metadata response,
and if it's a real user (Mac, Windows, Chrome, Firefox user agent, etc.) => proxy to your real page / default response / SSR page.
Maybe I'm forgetting something; I don't know if this could work.
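The routing the comment above asks about reduces to one decision per request, which nginx can express with a `map` on `$http_user_agent`. Here is that decision as a plain function (upstream names are invented for illustration):

```javascript
// Bot user-agents get routed to a slim metadata endpoint, real browsers
// to the normal SPA response. In nginx this would be a map/if on
// $http_user_agent selecting between two proxy_pass upstreams.
const BOTS = /twitterbot|facebookexternalhit|slackbot|discordbot|linkedinbot/i;

function chooseUpstream(userAgent) {
  return BOTS.test(userAgent || '') ? 'meta_api' : 'spa_default';
}
```

This should work for link previews, with the caveat mentioned elsewhere in the thread: Google prefers that bots and users see the same content, so it's safer to limit the bot branch to meta tags rather than entirely different pages.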