Stefan List
Introduction to Recording Rules - Use Case: Grafana Loki Metric Queries
In this video, you'll learn what recording rules are, when they're used, and how to configure them in Grafana Cloud.
#grafana #loki #dashboard #observability #prometheus #recordingrules
00:00 Introduction
00:34 Recording Rules Motivation & Overview
01:18 Example and End Result
03:03 What are recording rules?
05:29 Use Case Overview and Queries
08:20 Step by Step Guide for Creating Recording Rules in Grafana Cloud
11:10 Recap and Best Practices
12:27 Additional Resources
## Queries
### Original LogQL Query:
sum by(method) (count_over_time({cluster="microbs"} | logfmt | method =~ `.+` [$__range]))
### Recording Rule:
* Name: method:microbs_loglines_per_minute:sum
* Query: sum by(method) (count_over_time({cluster="microbs"} | logfmt | method =~ `.+` [1m]))
### Query the Recorded Metric
sum_over_time(method:microbs_loglines_per_minute:sum[$__range])
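### Equivalent Rule Group (Self-Managed Loki)
For self-managed Loki, the same rule can be expressed as a Prometheus-style rule group loaded by the Loki ruler. This is a minimal sketch with an illustrative group name; the video configures the rule through the Grafana Cloud UI instead:

groups:
  - name: microbs-log-metrics   # illustrative group name
    interval: 1m                # evaluate once per minute, matching the [1m] range
    rules:
      - record: method:microbs_loglines_per_minute:sum
        expr: sum by(method) (count_over_time({cluster="microbs"} | logfmt | method =~ `.+` [1m]))

The rule name follows the level:metric:operations convention from the Prometheus best practices linked below.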
## Links
- Recording Rules Best Practices: prometheus.io/docs/practices/rules/
- Grafana Cloud: Configure Recording Rules: grafana.com/docs/grafana/latest/alerting/alerting-rules/create-mimir-loki-managed-recording-rule/
views: 483

Video

6 Easy Ways to Improve your Log Dashboards with Grafana and Loki
30K views · a year ago
Are you new to Grafana, Grafana Loki, or both? Then this video is for you. It shows you how to implement a log dashboard in a few easy steps that goes beyond the scope of Grafana's Explore mode. Blog Post: grafana.com/blog/2023/05/18/6-easy-ways-to-improve-your-log-dashboards-with-grafana-and-grafana-loki/ 00:00 Introduction 02:02 1. Adding Template Variables for Log Labels 03:16 2. Addi...
Demo: Grafana Faro and xk6-browser
1.3K views · a year ago
This video shows how to collect front-end telemetry and how to test it efficiently using Grafana and k6 tooling.
Use the Grafana Stack to Drill Down from High Level KPIs to the Underlying Issue
7K views · 2 years ago
In this short demo video, you'll see how to go from a very general technical overview to quickly finding a problem in your microservices-based applications. With a single click you jump into the relevant endpoint trace, gather the corresponding metadata, and with another click nail down the log line that tells you why your application is failing. Metrics, logs and traces are stor...
Demo: How to Enrich Data Using a Key-Value Lookup in Apache NiFi
8K views · 4 years ago
In this demo I show how to enrich data with a lookup to a RESTful API based on a certain parameter in the data set. More information on this blog: datahovel.com/2019/11/14/big-data-and-stream-processing-101-part-4-how-to-do-a-simple-key-value-enrichment-in-apache-nifi/ Watch this video to create the basic flow: czcams.com/video/OHLYJUOTaYc/video.html&t=
Demo: How to Connect to a Relational Database Using Apache NiFi
29K views · 4 years ago
This video shows how to connect to MySQL and query records to be used for further processing in Apache NiFi. More details on the blog post: datahovel.com/2019/11/09/big-data-and-stream-processing-101-how-to-connect-to-a-rdbms-using-apache-nifi/

Comments

  • @fumaremigel · a month ago

    Great video! Please make another one like this. For prometheus maybe? Or tempo

  • @VenkateshMurugadas · a month ago

    You saved me a lot of time. Great video

  • @joffemannen · 2 months ago

    A more concrete question maybe? I was counting top user agents, but now that our traffic has increased we have more than 2000 different user agents per time unit, and I run into the max series issue with a query like topk(50, sum by(user_agent_original) (count_over_time({deployment_environment="prod", event_name="request"} [$__range]))), where I naively first thought the topk(50) would protect me from that limit. It's an instant query, showing a table view with the values as a gauge. I could parse the user agent harder to get the major browser version to bring the options below 2000, but this is structured metadata, so I can't do that in LogQL; I have to do it in the collector (or in promtail?). I can't increase the 2000 limit, and I don't want to. Is there any way to rewrite the query to get around this issue?

  • @ygdrassilperlas5284 · 2 months ago

    How do I know which metrics data source it will be stored in?

  • @nguyentuantu7017 · 2 months ago

    very useful and realistic

  • @DiTsi · 4 months ago

    Great! Thank you

  • @SiddharthKaushik-gn7fh · 5 months ago

    Excellent explanation

  • @saeedsafavi26 · 6 months ago

    It was awesome, I learned a lot ❤

  • @joffemannen · 7 months ago

    Nice! Got me going. I'm new to LogQL and Grafana, got some Splunk experience, and am struggling to translate what I have. But this was a nice start. Any recommended videos as a next step? I'm still struggling with a few things:
    1) The base query is implemented in each panel - a lot of maintenance, and I guess the query spends CPU x number of panels.
    2) I have a few regexes; I guess I should consider implementing them in the proxy in front of Loki so they are available in simple filters, for performance and maintenance.
    3) The drill-downs with data links - I only manage to do them in one level, and what corresponds to your "cluster" filter gets stuck for some reason. I want to drill down like 4 levels without making 4 dashboards with 8 separate panels and separate queries, because that's a lot of maintenance.
    4) Doing some arithmetic - I guess I have to learn transformations - like error rate in %, not in "per second".
    5) Combining similar values in the same graph - some of my log entries have 4 timings (time to first byte, request end time, etc.) - right now in 4 panels.
    6) Do the same but for logs in BigQuery.
    I'm sure I'll figure some of this out on my own, but one more kick in the right direction would save me some pulled hairs.

    • @condla · 5 months ago

      Hey, quite a lot of questions for a small comment block 😅 but let's try: 1) Have a look at this: grafana.com/blog/2020/10/14/learn-grafana-share-query-results-between-panels-to-reduce-load-time/ - plus, Loki does a lot of caching; also take a look at recording rules to speed up high-volume Loki queries: czcams.com/video/qGyoJPUIOz8/video.htmlsi=BtYmT94Bt5_U21O3

    • @condla · 5 months ago

      2) It depends; generally you want to set as few labels as possible, and with rather low cardinality, during ingest. Also, it's rather bad practice to set a label to something that is already in the log message. On the other hand, at query time, you want to use as many labels as possible to speed up the query.

    • @condla · 5 months ago

      3) Grafana Scenes enters the conversation. "Hi there, let me help you" grafana.com/developers/scenes/ Scenes can help you achieve the connection of multiple dashboards while keeping the context when jumping back and forth.

    • @condla · 5 months ago

      4) Yes, learn transformations, but you can also:
      * do arithmetic on queries directly (see the sketch at the end of this thread)
      * depending on the panel type, you will already have suggestions in the suggestions tab of the visualization section that show %

    • @condla · 5 months ago

      5) Just click "add query" below the first query of the panel and add as many queries to one panel as you want.
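
      A minimal sketch of point 4's "arithmetic on queries directly": an error rate in percent computed purely with LogQL query math, assuming a logfmt-formatted stream with a level field (the stream selector and field name here are illustrative, not from the video):

      100 * sum(count_over_time({cluster="microbs"} | logfmt | level="error" [$__range])) / sum(count_over_time({cluster="microbs"} | logfmt [$__range]))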

  • @photographymaniac2529 · 7 months ago

    This is not working for unstructured logs where we use pattern to match

    • @condla · 5 months ago

      Hey, generally speaking it works the same way. You just need to define the pattern or a regex first to extract the information you want to visualize. Which metrics do you want to extract from which type of logs?
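
      A minimal sketch of that, assuming unstructured lines such as "10.0.0.1 - GET /api/users 200" (the stream selector, line layout, and extracted label names are illustrative):

      sum by(method) (count_over_time({job="myapp"} | pattern `<ip> - <method> <path> <status>` [1m]))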

  • @bulmust · 8 months ago

    It is great. Thanks.

    • @condla · 5 months ago

      Thank you 🙂

  • @girirajb.c3673 · 9 months ago

    can you send me your promtail configuration for the above dashboard please?

    • @condla · 5 months ago

      There's nothing notable done in promtail. What's your challenge?

  • @mex0b0y · 10 months ago

    Thanks bro! It's an amazing explanation of how to use Loki more effectively

    • @condla · 8 months ago

      thanks, I'm happy you found the video useful!

  • @AjayKumar-lm4yr · 11 months ago

    How to store Grafana Loki logs in Azure Blob Storage

    • @condla · 8 months ago

      There are several ways you can accomplish this: either host your own Loki and use Azure Blob Storage as the storage layer, or ingest the logs into Grafana Cloud Loki and configure an export job (grafana.com/blog/2023/06/08/retain-logs-longer-without-breaking-the-bank-introducing-grafana-cloud-logs-export/)
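
      For the self-hosted option, a minimal sketch of the relevant Loki configuration (the account, key, and container values are placeholders, and the schema block depends on your Loki version):

      storage_config:
        azure:
          account_name: mystorageaccount      # placeholder
          account_key: ${AZURE_ACCOUNT_KEY}   # placeholder; inject a secret rather than a literal key
          container_name: loki-chunks         # placeholder
      schema_config:
        configs:
          - from: 2024-01-01
            store: tsdb
            object_store: azure               # point the index and chunk stores at the Azure client
            schema: v13
            index:
              prefix: index_
              period: 24h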

  • @iyiempire4667 · 11 months ago

    Just type a simple query: fields @message | filter @message like /$Filter/ | limit 100 - don't make it hard

    • @condla · 5 months ago

      Hi, thanks for the comment. This comes with a trade-off, and any kind of query language has a certain learning curve; I'm trying to reduce the one for Loki with this video. In the near future you will see Grafana implement an Explore UI that allows you to query and aggregate logs without any query language at all. But users can still make use of the power of LogQL if they want.

  • @MattFine · a year ago

    This was very well done. Thank you. Please continue to make additional videos like this tutorial.

    • @condla · 8 months ago

      thanks for your kind words! considering this ;)

  • @abhishekkhanna1349 · a year ago

    Can you please share the application you used to create this dashboard?

    • @condla · a year ago

      You mean application as in Grafana for creating dashboards? And Grafana Loki as the solution to store and query logs? I'm confused because I put this in the title. If you search online you should find tons of resources for both.

    • @abhishekkhanna1349 · a year ago

      @condla I wanted the application code which was generating the logs and traces. Thanks a ton for the video!!

    • @condla · a year ago

      Ahhhh 😁 I've used a dummy observability application that can be deployed to test things like this: follow the link for more information microbs.io/

  • @krzysztofwiatrzyk4260

    You have presented the use of ad-hoc filters perfectly! Thank you, dear sir; I was trying to understand it from the Grafana docs, but it is just overwhelming.

    • @condla · a year ago

      Thanks for the feedback... Anything else that's commonly used but needs clarification?

    • @iggyvillanueva2022 · 11 months ago

      @condla Hi, is there a way to get the difference of the timestamps so we can get an API latency and make a trend chart?

  • @bganesh3413 · a year ago

    Please don't add music, it is very disturbing

  • @agpjustordinaryviewer

    Currently my office is working on some pilot projects to have a centralized logging and metrics dashboard using Grafana Loki. We found out Grafana and Loki are powerful tools; however, it is quite difficult to find references on Google. This video is very insightful for Grafana Loki. However, there is one thing that is not working in our Grafana (v10.0.1): if we change to the instant query type, then all different values in a pie chart get aggregated, so it displays one value only. This issue doesn't happen with the range query type. Have you ever heard about this issue?

    • @condla · a year ago

      Hi @albogp, thanks for the feedback. I haven't heard about this yet, but you can ask on community.grafana.com or join the community Slack and ask your question there: grafana.slack.com

  • @Lars-pi4vx · a year ago

    Great video! I wish there was more. I wonder if there is any solution to do such ad-hoc filters with "regex"- or "pattern"-parsed logs?

    • @condla · a year ago

      Thanks 😊. Yes, you can use the regex/pattern parser to do any kind of ad hoc filtering. Examples are dependent on the expressions and patterns of course. What's your log pattern and what would you like to filter for?

    • @Lars-pi4vx · a year ago

      Hi @condla, thanks for your quick reply! This is my LogQL for a huge file with more or less unstructured log rows, which shows the number of all errors that occurred in the selected period:

      sum by(logMsgMasked) (count_over_time({env=~"$env", job="core-files", filename=~"activities.log"} |~ `(WARNING|ERROR)` | regexp `^\[(?P<datestr>.+) (?P<timestr>.+)\]\[PID:(?P<pid>\d+)\] level\.(?P<loglevel>\S+): (?P<logMsg>.*)` | regexp `((a|A)ccount #?(?P<accountId>\d+))` | label_format logMsgMasked="{{regexReplaceAll \"(\\\\d+)\" .logMsg \"\\\\d+\"}}" | line_format "{{.logMsgMasked}}" [$__range]))

      Suppose there was a log message in the "logMsg" pattern match section: "Memory for account 4711 exhausted by 123456 bytes." This will be converted to "Memory for account \d+ exhausted by \d+ bytes." So the converted message should be in an ad-hoc filter panel. Activating the ad-hoc filter on it should display all messages in a corresponding raw message panel below it, regardless of the number of bytes or the account where the error occurred. I hope I have been able to describe my problem clearly enough.

  • @utpxxx · a year ago

    will this work with the json parser for the aggregation if they are not labels in loki already?

    • @condla · a year ago

      That's correct: given your log lines are in JSON format, you can parse them on read (at query time) with Loki's json parser.
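
      A minimal sketch, reusing the video's query shape with the json parser in place of logfmt (the field name is illustrative):

      sum by(method) (count_over_time({cluster="microbs"} | json | method =~ `.+` [1m]))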

  • @maxlagus9042 · a year ago

    Literally the only guide that actually shows how to do stuff! A like from me and my team :)

    • @condla · a year ago

      Thanks for the nice words to you and your team ❤

  • @bhagyashrighuge4170

    Can we use it for JSON data?

    • @condla · a year ago

      Yes of course, you would just use the json parser instead of the logfmt one

  • @simonshkilevich3032

    like seriously, god bless you.

    • @condla · a year ago

      Thank you, sounds like this solved an issue for you 😊

  • @wladimirdelacruz · a year ago

    Thanks Stefan!!!! 😀

    • @condla · a year ago

      Thanks for the feedback

  • @tobiashelbing1233 · a year ago

    Thank you very much!

    • @condla · a year ago

      Haha, you're welcome 😊

  • @Babe_Chinwendum · a year ago

    Thank you so much. This was really helpful!

    • @condla · a year ago

      Thanks for the feedback. I wrote a blog post that accompanies the video, released yesterday: grafana.com/blog/2023/05/18/6-easy-ways-to-improve-your-log-dashboards-with-grafana-and-grafana-loki/

    • @Babe_Chinwendum · a year ago

      @condla Thank you so much! I was able to complete a task thanks to this; my logs were in JSON format, logfmt was not parsing them, and I guess ad-hoc variables would not work in that case.

  • @kk77781 · a year ago

    Thanks.. did you try any drill-down within the dashboard, that is, from one panel to another?

    • @condla · a year ago

      You can also do that with data links by just specifying the same dashboard. You can see an example of data links linking to the same dashboard in this video section "data links" czcams.com/video/EPLvB1eVJJk/video.html

  • @kk77781 · a year ago

    Could you please elaborate more.. how did you build this drop-down?

    • @condla · a year ago

      Have a look at data links: grafana.com/docs/grafana/latest/panels-visualizations/configure-data-links/

  • @juandavidcorrea5501

    Awesome! Could you please share the repository with us?

    • @condla · a year ago

      Check this page, where you get further info and a link to the Faro repo grafana.com/oss/faro/

  • @nemeth-io · a year ago

    Very insightful! Thank you, Stefan.

    • @condla · a year ago

      Thanks Andreas!

  • @alainphm · a year ago

    Awesome stuff !

  • @olegbrigmann378 · a year ago

    Great.. thanks a lot. NiFi has changed a bit since then, but with your guide even that was no problem. Without it... phew

    • @condla · a year ago

      You're welcome, glad to hear it. The video is already a bit older 😊

  • @rezahze4641 · 2 years ago

    @Stefan Dunkler: Can you please show how the driver should be installed, and which version?

  • @ankushsingh9568 · 2 years ago

    Thanks for sharing the knowledge... it's really good

  • @alex_pike · 2 years ago

    Thanks!

  • @zeinabmeftah · 2 years ago

    Excellent!

  • @ArmenSanoyan · 2 years ago

    Hi Stefan, and thanks for the video. How can I know whether data was inserted or deleted? Because in both cases I just get an object of data.

  • @chitrangsharma · 2 years ago

    Hi sir, can we put the resultant data into another database? For example, can I put that data into some other SQL database table? Basically my requirement is to make a replica of a database.

    • @condla · 2 years ago

      Sure, drag another processor onto the board that has a name like "Put.....", e.g. PutSQL to write to a system that talks SQL. You need to add the JDBC driver just like you add it when you query. There are plenty of other Put processors that use the write APIs of many other databases.

    • @chitrangsharma · 2 years ago

      @@condla thanks a lot ♥️

  • @fariaanzum5913 · 3 years ago

    Stefan, I am unable to connect

  • @raomohsin3330 · 3 years ago

    Hi Stefan, I am new to Apache NiFi. I am trying to fetch data from SQLite and want to store it in a CSV file; could you please share a tutorial for this?

  • @saisaran5763 · 3 years ago

    Hi Stefan, can we look up multiple fields in the Pokémon API (using the same lookup processor)?

  • @pdzingade · 3 years ago

    Hi Stefan, thank you so much for this video.. all the information shared is just great. If you could find some time to upload more videos on NiFi, it would be really great!!

    • @condla · 3 years ago

      Thanks so much for the feedback. Highly appreciate it. What would you be interested in specifically?

    • @pdzingade · 3 years ago

      @condla If you could share some knowledge on building our own customized processors, it would be wonderful

  • @nelsonkatale398 · 3 years ago

    How can I fix this error: Cannot create JDBC driver of class 'com.mysql.jdbc.Driver' for connect URL 'jdbc:mysql//mysql-rfam-public.ebi.ac.uk:4497/Rfam'

  • @minhtampham7232 · 3 years ago

    Thanks so much!!! The best video.

    • @condla · 3 years ago

      You're welcome

  • @hamidmushtaq7611 · 4 years ago

    I find graphical programming more difficult than text based programming. In the former, I often get confused using the user interface.

    • @condla · 4 years ago

      Hi Hamid, I know what you mean. However, I guess you can get as confused with code as with graphical user interfaces if you're not following clean-code practices. If you want to get into ways to "write" clean NiFi flows, you can watch this video: czcams.com/video/v1CoQk730qs/video.html&ab_channel=NiFiNotes

  • @TheGoodDalek · 4 years ago

    Excellent video, thanks! A question: max value property lets us get new records, but what if we wanted to get updated records as well? Say we have a 'last updated' column with a timestamp to facilitate this. Should we then build another pipeline for updates?

    • @condla · 4 years ago

      You can add multiple columns to check for both new and updated records based on the timestamp in those columns.

    • @rezahze4641 · 2 years ago

      @condla Can you please show how the driver should be installed, and which version?

  • @joaquincasanovacordova7746

    Excellent!

  • @rajeshluckky2751 · 4 years ago

    Nice.. I want to connect to a PostgreSQL DB which is secured by SSL. Since there is no property on the DBCPConnectionPool controller to provide SSL details, how can we connect to the DB? Can we pass sslcert and sslmode in the query params of the database URL property?

    • @condla · 4 years ago

      The answer in the following thread contains a link that should help you with this issue. Let me know how it goes: community.cloudera.com/t5/Support-Questions/QueryDatabaseTable-giving-error-while-pulling-data-from/td-p/220949

    • @rajeshluckky2751 · 4 years ago

      @condla Thanks for the info. I tried passing the SSL details in the query params of the 'database connection url' property (jdbc:postgresql://host:port/databaseName?ssl=true&sslrootcert=C:\Certificate\DEV.cert&sslmode=verify-full), but I'm getting the following exception: Caused by: org.postgresql.util.PSQLException: SSL error: Received fatal alert: handshake_failure