![ViTechTalks](/img/default-banner.jpg)
ViTechTalks
India
Joined 18 Dec 2021
Welcome to ViTechTalks! We are passionate about exploring the latest in technology and helping you navigate the ever-evolving digital landscape. Our channel is designed to inspire, educate, and keep you at the forefront of innovation.
What You'll Find:
Tech Tutorials: Learn how to master MuleSoft with step-by-step guides.
Interviews and Discussions: Engage with industry experts through interviews and discussions.
Tech Challenges: Join us in fun and informative tech challenges to put your knowledge to the test.
Why Subscribe?
🌐 Stay Informed: Receive regular updates on the hottest tech releases and trends.
🛠️ Learn and Explore: Enhance your tech skills with our educational content.
1) MuleSoft
2) Progress 4GL
3) SQL
4) Java - Spring Boot & Microservices
5) Python
🔗 Connect with ViTechTalks:
Instagram: v_i_tech
Facebook: profile.php?id=61554543787357
Thanks for being a part of the VITechTalks family. Let's tech it to the next level! 🚀💻🔧
Snowflake Session | Day-1 | What is Database & Schema | @ViVisionTechnologies | Snowflake | Tables
Hi, welcome to ViTech Talks. In this session we talk about:
What a database is and the importance of schemas
What SQL is and how to create tables inside a schema
For course details, call us: 9972935359
Join this channel to get access to perks:
czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin
GitHub link for practice - github.com/VITechTalks
--------------------------
Database
Schema
SQL
What is the road map for SQL
Tables & insert / bulk insert / load
Database: a storage location where we can store data (structured, semi-structured, and unstructured).
It is a collection of data.
Examples from the session:
create user -- API -- database
Instagram -- fetch -- database
XYZ organization -- employee / sales, etc.
Ex: CREATE DATABASE <database name>, e.g. ecomm or bank
Schema: the logical structure of the data.
Example: a BANK database with one schema per bank:
HDFC -- customer, employee, loan
ICICI -- customer, employee, loan
Within the same schema we can't create duplicate table names (the same table name can exist in different schemas of one database).
First we need to create the schema; then you can create tables inside it.
Ex: create schema sbi;
SQL: Structured Query Language. SQL is not a database; in order to communicate with any database, we use SQL.
C -- Create (create database / schema / table, insert data)
R -- Read (fetch)
U -- Update
D -- Delete
These are the data manipulations.
It reads almost like plain English.
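The CRUD operations above map directly onto SQL statements. A minimal sketch in Snowflake-style SQL (the employee table and values here are illustrative, not from the session; note the session's own demo never shows an UPDATE):

```sql
-- C: create a table and insert a row
create table employee (empid number, empname string);
insert into employee values (1, 'Asha');

-- R: read / fetch rows
select * from employee;

-- U: update an existing row
update employee set empname = 'Asha K' where empid = 1;

-- D: delete a row
delete from employee where empid = 1;
```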
Databases / RDBMS:
MySQL
Oracle
PostgreSQL
Sybase
MongoDB
Cosmos DB
Snowflake
WhatsApp group - details by Sunday EOD
create database bank;   -- run each statement with Ctrl + Enter
use database bank;
create schema hdfc;
create schema sbi;
create table bank.hdfc.employee (empid number, empname string);
create table bank.sbi.employee (empid number, empname string, salary int);
show tables;
use database vitech;
use database bank;
Columns: Loan_ID, loan_status, Principal, terms, effective_date, due_date, paid_off_time, past_due_days, age, education, Gender
create table loan_payments (
    Loan_ID string,
    loan_status string,
    Principal string,
    terms int,
    effective_date string,
    due_date string,
    paid_off_time string,
    past_due_days string,
    age int,
    education string,
    Gender string
);
desc table loan_payments;
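The session's loan_payments table stores every date as a string, which works for a first demo but makes filtering and sorting awkward. A typed variant is sketched below; this is an assumption for illustration, not the table used in the session (the sample rows even put a date string in past_due_days, so it is typed as DATE here):

```sql
create table loan_payments_typed (
    loan_id        string,
    loan_status    string,
    principal      number,
    terms          int,
    effective_date date,
    due_date       date,
    paid_off_time  timestamp_ntz,  -- e.g. '9/14/2016 19:31'
    past_due_days  date,           -- the sample data holds a date here
    age            int,
    education      string,
    gender         string
);
```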
insert into loan_payments (Loan_ID, loan_status, Principal, terms, effective_date, due_date, paid_off_time, past_due_days, age, education, Gender)
values ('xqd20166231', 'PAIDOFF', '1000', 30, '09-08-2016', '10-07-2016', '9/14/2016 19:31', '09-08-2016', 45, 'High School Below', 'male');
insert into loan_payments (Loan_ID, loan_status, Principal, terms, effective_date, due_date, paid_off_time, past_due_days, age, education, Gender)
values ('xqd20166231', 'PAIDOFF', '1000', 30, '09-08-2016', '10-07-2016', '9/14/2016 19:31', '09-08-2016', 45, 'High School Below', 'male');
insert into loan_payments (Loan_ID, loan_status, Principal, terms, effective_date, due_date, paid_off_time, past_due_days, age, education, Gender)
values ('xqd20166231', 'PAIDOFF', '1000', 30, '09-08-2016', '10-07-2016', '9/14/2016 19:31', '09-08-2016', 45, 'High School Below', 'male');
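The agenda also mentions bulk insert/load. Rather than repeating single-row INSERTs, Snowflake usually bulk-loads from a stage with COPY INTO. A hedged sketch; the file and stage names are illustrative, and the PUT step runs from a SnowSQL client rather than the web UI:

```sql
-- upload a local CSV into a folder of the user stage (run from SnowSQL)
put file://loan_payments.csv @~/loans/;

-- bulk-load every staged file in that folder into the table
copy into loan_payments
  from @~/loans/
  file_format = (type = csv skip_header = 1);
```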
select * from loan_payments;
delete from loan_payments;
drop table loan_payments; -- deletes the table data as well as the table structure
undrop table loan_payments;
#SQL
#vivisiontechnologies
#snowflake
#snowflaketraining
#sqlzerotohero
#database
#snowflakeintegration
#sqlroadmap
#vivisiontech
#sqltutorial
#sqlforbeginners
#snowflakes
Snowflake jobs
Snowflake Openings
snowflake Tutorial
Snowflake Training
#snowflake
#snowflakeonlinetraining
#ADF #DBT #AZuredatafactory
#softwarejoborientedtraining
#DBTCloud #Oracledatabase
#Git #AWS #Snowflake
#vivisiontechnologies
#SnowflakeScenariobasedInterviewQuestion
#snowflakedatawarehouse #snowflaketutorial
#snowflakedatabase #snowflaketutorial #snowflakeonline #adf #dbt
#Snowflake #WhatIsSnowflake #WhatIsSnowflakeDatawareHouse #SnowflakeTutorial #WhatIsSnowflakeDatabase #SnowflakeArchitecture #SnowflakeFeatures
112 views
Videos
Snowflake Training Batch July-22 | @vitechtalks6017 | Snowflake Demo Session | Course Overview
158 views, 14 hours ago
Hi Welcome to ViTech Talks , Here we are going to talk about What is snowflake and features Create Snowflake Account - signup.snowflake.com/ Course details you can contact - 9972935359 @ViVisionTechnologies @vitechtalks6017 Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin #SQL #vivisiontechnologies #snowflake #snowflaketraining #sqlzerotohero #databa...
MuleSoft Training| Create API in Simple Steps | @vitechtalks6017 | June-28-2024 7-AM Batch
222 views, 28 days ago
Hi Welcome To ViTech Talks , In this video you can learn about below concepts and demo for the session What is API? and How it works How to create sample project Create api for person eligible for vote or not Conenct with MySQL database using MuleSoft Anypoint studio for any training please send an email - vitschool21@gmail.com #vitechtalks @vitechtalks6017 @ViVisionTechnologies #mule #mulesoft...
Snowflake Session | Day-2 | What is Database & Schema | @ViVisionTechnologies | Snowflake | Tables
129 views, 28 days ago
Hi Welcome to ViTech Talks , Here we are going to talk about What is Database and importance of schema what is sql and how to create tables schema Course details for call us- 9972935359 Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin Git hub link for practice - github.com/VITechTalks Snowflake Realtime Training Session-2 Database : Its a collection ...
MuleSoft Training-Demo | What is MuleSoft | @vitechtalks6017 | June-27-2024 7-AM Batch
178 views, 28 days ago
Hi Welcome To ViTech Talks , In this video you can learn about below concepts and demo for the session What is API? and How it works What is integration ? What is middleware without middle ware What is mulesoft Phases of Mulesoft Anypoint studio for any training please send an email - vitschool21@gmail.com #vitechtalks @vitechtalks6017 @ViVisionTechnologies #mule #mulesoft #mulesoftTraining #an...
Snowflake Demo | Day-1 | What is Snowflake & Jobs| @vitechtalks6017 | Course Overview |Snow Training
259 views, 1 month ago
Hi Welcome to ViTech Talks , Here we are going to talk about What is snowflake and features Create Snowflake Account - signup.snowflake.com/?trial=student @ViVisionTechnologies @vitechtalks6017 Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin #SQL #vivisiontechnologies #snowflake #snowflaketraining #sqlzerotohero #database #snowflakeintegration #sqlr...
SQL Zero To Hero | Session-2.3 | Setup Snowflake With VS Code | @vitechtalks6017 | Download VS
51 views, 1 month ago
Hi Welcome to ViTech Talks , In this video you can learn how to download and VS code . Setup Snowflake Account With VS code Practice SQLs @vitechtalks6017 @ViVisionTechnologies Snowflake account creation- signup.snowflake.com/?trial=student VS code download Link - code.visualstudio.com/download Direct Download VS code - code.visualstudio.com/docs/?dv=win64user Join this channel to get access to...
SQL Zero To Hero | Session-2.2 | Snowflake 120 days free trial Account | @vitechtalks6017
97 views, 1 month ago
Hi Welcome to ViTech Talks , In this video you can learn how to create snowflake 120 days free trial account Snowflake account creation- signup.snowflake.com/?trial=student Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin @vitechtalks6017 #sql #sqlzerotohero #vivision #vivisiontechnologies #snowflake #vivision technologies #snowflaketraining #sqlzero...
SQL Zero To Hero | Session-2.1 | Download & Install MYSQL Workbench | #vivisiontechnologies | SQL
97 views, 1 month ago
Hi Welcome to ViTech Talks , In this video you can learn how to download and install MYSQL software and working with MySQL workbench @ViVisionTechnologies @vitechtalks6017 Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin #sql #sqlzerotohero #vivision #vivisiontechnologies #snowflake #vivision technologies #snowflaketraining #sqlzerotohero #database #...
SQL Zero To Hero | Session-1 | What is SQL and importance | @ViVisionTechnologies |What is Database
278 views, 1 month ago
Hi Welcome to ViTech Talks , In this video we have discussed about the below concepts. What is SQL What is Database Importance of SQL Sub languages of SQL DDL DML DQL DCL TCL Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin @ViVisionTechnologies @vitechtalks6017 #sql #sqlzerotohero #vivision #vivisiontechnologies #snowflake #vivision technologies #sn...
Snowflake Session | Day-2 | What is Database & Schema | @ViVisionTechnologies | Snowflake
79 views, 1 month ago
Hi Welcome to ViTech Talks , Here we are going to talk about What is Database and importance of schema what is sql and how to create tables schema Course details for call us- 9972935359 Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin SQL for practice create database vivision; show databases; create table customer (cid int,customerName string); creat...
Snowflake Demo Session | Day-1 | What is Snowflake & Why| Course Overview | @ViVisionTechnologies
82 views, 1 month ago
Hi Welcome to ViTech Talks , Here we are going to talk about What is snowflake and features Create Snowflake Account - signup.snowflake.com/?trial=student @ViVisionTechnologies @vitechtalks6017 Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin #SQL #vivisiontechnologies #snowflake #snowflaketraining #sqlzerotohero #database #snowflakeintegration #sqlr...
SQL Zero To Hero | Introduction Overview Of the course | @ViVisionTechnologies | Topics To be cover
104 views, 1 month ago
Hi Welcome to ViTech Talks, In this video you we have discussed about the below concepts. How to become master in SQL Road map of the SQL developer Course Syllabus Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin @ViVisionTechnologies @vitechtalks6017 #SQL #vivisiontechnologies #snowflake #snowflaketraining #sqlzerotohero #database #vitechtalks #snow...
ViVision | Course and Training Centre Details for Snowflake & MuleSoft | @ViVisionTechnologies
170 views, 1 month ago
ViVision | Course and Training Center Details for MuleSoft & Snowflake Subscribe Our New Channel : www.youtube.com/@ViVisionTechnologies @vitechtalks6017 @ViVisionTechnologies Join this channel to get access to perks: czcams.com/channels/OCmIxlnxT4NICdOR1E36vQ.htmljoin #snowflake #snowflakeintegration #snowfall #snowpipe #snowsql @vitechtalks6017 #vitechtalks #snowflake #mulesoft #mulesofttrain...
ViTech Talks | Develop Calculator API | #vitechtalks | Anypoint Code Builder | Arithmetic Operations
163 views, 2 months ago
ViTech Talks | Import RAML & Implement API | #vitechtalks | ABC Cloud Setup & Implementation
93 views, 2 months ago
ViTech Talks | How To Setup AnyPoint Code Builder With Cloud IDE | Design Sample RAML & Publish
46 views, 2 months ago
ViTech Talks | Design RAML & Publish Into Exchange Using Anypoint Code Builder | MuleSoft
70 views, 2 months ago
ViTech Talks | How To Setup AnyPoint Code Builder | Install VS Code | Design sample RAML
137 views, 2 months ago
ETL Process With MuleSoft | Read Data from S3 and Load Into Snowflake & MySQL | ViTech Talks
233 views, 2 months ago
ETL Process With MuleSoft | Read data from S3 bucket and Insert into AWS MY SQL | @vitechtalks6017
119 views, 2 months ago
AWS MY SQL Perform Data base Operations | CRUD Operations With MySQL in AWS | ViTechTalks | AWS SQL
161 views, 2 months ago
Create My SQL Data Base in AWS Setup With Db Visualizer | AWS | @vitechtalks6017 | Create Database
166 views, 2 months ago
Create Naukri Profile for MuleSoft Developer | @vitechtalks6017 | Naukri.com | Create Account
249 views, 2 months ago
Automate Process Read Data from S3 Bucket & Insert into Snowflake | @vitechtalks6017 | Realtime Mule
108 views, 2 months ago
Read Data from S3 Bucket & Insert into Snowflake | @vitechtalks6017 | MuleSoft With AWS Integration
250 views, 3 months ago
Read Data from S3 Bucket | MuleSoft With AWS Integration | @vitechtalks6017 | MuleSoft Integration
129 views, 3 months ago
Load Data Into S3 Bucket | MuleSoft With AWS Integration | @vitechtalks6017 | MuleSoft Integration
219 views, 3 months ago
MuleSoft Mock Interview | @vitechtalks6017 | MuleSoft Developer Interview
1.9K views, 3 months ago
Create S3 Bucket | MuleSoft With AWS Integration | @vitechtalks6017 | MuleSoft Integration
197 views, 3 months ago
We should use connection pooling to avoid creating multiple connections to the DB. Whether it's parallel for-each, batch, or for-each, pooling is much needed when dealing with large chunks of data.
Hi, yes, you are correct, but in this playlist we have just explained the different ways it can be done.
Thanks
Please upload an advanced MuleSoft real-time project and Level 2 exam preparation videos.
Nice explanation of each step. A beneficial project.
Thank you so much for your feedback
Excellent 👌 explanation! Please help: how do I download an AWS Security Hub CSV file containing 21+ records? For example, I'm trying to download a Security Hub CSV file with 21+ records; are there any ways to do that? By default the maximum number of records is 20, so if I want to check 200 records for a certain vulnerability (e.g. s3.x) I have to do that 20 times. Bothersome. Please help.
Thanks man
Very Good explanation , Thank you so Much ❤❤
Hi , I'm encountering an error during the deploy stage of the CI/CD pipeline. Can you help troubleshoot this issue? Error : [ERROR] Failed to execute goal org.mule.tools.maven:mule-maven-plugin:4.0.0:deploy (default-deploy) on project create-account-api: Execution default-deploy of goal org.mule.tools.maven:mule-maven-plugin:4.0.0:deploy failed: 401 Unauthorized: -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
Hi, 401 Unauthorized; can you please double-check the user name and password, and how you are passing them?
@@vitechtalks6017 Now I am getting the different error . Please find the below error . I will share the Pom.xml and azure-pipeline.xml script as well . Could you please help on this . Error : [ERROR] Failed to execute goal org.mule.tools.maven:mule-maven-plugin:4.0.0:deploy (default-deploy) on project create-account-api: Execution default-deploy of goal org.mule.tools.maven:mule-maven-plugin:4.0.0:deploy failed: Couldn't find environmentName named [dev] -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException . pom.xml : <?xml version="1.0" encoding="UTF-8"?> <project xmlns="maven.apache.org/POM/4.0.0" xmlns:xsi="www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="maven.apache.org/POM/4.0.0 maven.apache.org/xsd/maven-4.0.0.xsd"> <modelVersion>4.0.0</modelVersion> <groupId>com.mycompany</groupId> <artifactId>create-account-api</artifactId> <version>1.0.0-SNAPSHOT</version> <packaging>mule-application</packaging> <name>create-account-api</name> <properties> <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding> <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding> <app.runtime>4.5.3</app.runtime> <mule.maven.plugin.version>4.0.0</mule.maven.plugin.version> </properties> <build> <plugins> <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-clean-plugin</artifactId> <version>3.2.0</version> </plugin> <plugin> <groupId>org.mule.tools.maven</groupId> <artifactId>mule-maven-plugin</artifactId> <version>${mule.maven.plugin.version}</version> <extensions>true</extensions> <configuration> <cloudHubDeployment> <uri>anypoint.mulesoft.com</uri> <muleVersion>4.4.0</muleVersion> <username>nikhilteja</username> 
<password>Nikhil@143</password> <applicationName>${project.artifactId}</applicationName> <environment>dev</environment> <region>us-east-2</region> <workers>1</workers> <workerType>MICRO</workerType> <objectStoreV2>true</objectStoreV2> <properties> <http.port>${http.port}</http.port> <mule.env>dev</mule.env> </properties> </cloudHubDeployment> <sharedLibraries> <sharedLibrary> <groupId>net.snowflake</groupId> <artifactId>snowflake-jdbc</artifactId> </sharedLibrary> </sharedLibraries> <classifier>mule-application</classifier> </configuration> </plugin> </plugins> </build> <dependencies> <dependency> <groupId>org.mule.connectors</groupId> <artifactId>mule-http-connector</artifactId> <version>1.9.0</version> <classifier>mule-plugin</classifier> </dependency> <dependency> <groupId>org.mule.connectors</groupId> <artifactId>mule-sockets-connector</artifactId> <version>1.2.4</version> <classifier>mule-plugin</classifier> </dependency> <dependency> <groupId>org.mule.modules</groupId> <artifactId>mule-apikit-module</artifactId> <version>1.9.2</version> <classifier>mule-plugin</classifier> </dependency> <dependency> <groupId>org.mule.examples</groupId> <artifactId>mule-plugin-rcg-snowflake-sys-api-spec</artifactId> <version>2.5.0</version> <classifier>mule-plugin</classifier> </dependency> <dependency> <groupId>net.snowflake</groupId> <artifactId>snowflake-jdbc</artifactId> <version>3.14.4</version> </dependency> <dependency> <groupId>com.mulesoft.connectors</groupId> <artifactId>mule4-snowflake-connector</artifactId> <version>1.1.3</version> <classifier>mule-plugin</classifier> </dependency> <dependency> <groupId>com.mulesoft.modules</groupId> <artifactId>mule-secure-configuration-property-module</artifactId> <version>1.2.7</version> <classifier>mule-plugin</classifier> </dependency> <dependency> <groupId>org.mule.connectors</groupId> <artifactId>mule-email-connector</artifactId> <version>1.7.2</version> <classifier>mule-plugin</classifier> </dependency> </dependencies> 
<repositories> <repository> <id>anypoint-exchange-v3</id> <name>Anypoint Exchange</name> <url>maven.anypoint.mulesoft.com/api/v3/maven</url> <layout>default</layout> </repository> <repository> <id>mulesoft-releases</id> <name>MuleSoft Releases Repository</name> <url>repository.mulesoft.org/releases/</url> <layout>default</layout> </repository> </repositories> <pluginRepositories> <pluginRepository> <id>mulesoft-releases</id> <name>MuleSoft Releases Repository</name> <layout>default</layout> <url>repository.mulesoft.org/releases/</url> <snapshots> <enabled>false</enabled> </snapshots> </pluginRepository> </pluginRepositories> </project> azure-pipeline.yml : # Maven # Build your Java project and run tests with Apache Maven. # Add steps that analyze code, save build artifacts, deploy, and more: # docs.microsoft.com/azure/devops/pipelines/languages/java trigger: - feature pool: vmImage: ubuntu-latest variables: MAVEN_CACHE_FOLDER: $(Pipeline.Workspace)/.m2/repository MAVEN_OPTS: '-Dmaven.repo.local=$(MAVEN_CACHE_FOLDER)' steps: - task: Cache@2 displayName: Cache Maven local repo inputs: key: 'maven | "$(Agent.OS)" | **/pom.xml' restoreKeys: | maven | "$(Agent.OS)" maven path: $(MAVEN_CACHE_FOLDER) - script: mvn install -B -e - task: Maven@3 displayName: Build inputs: options: '-DskipTests' mavenPomFile: 'pom.xml' mavenOptions: '-Xmx3072m $(MAVEN_OPTS)' javaHomeOption: 'JDKVersion' jdkVersionOption: '11.0' jdkArchitectureOption: 'x64' goals: 'package' - task: MavenAuthenticate@0 inputs: artifactsFeeds: 'amarthalurunaveenteja' mavenServiceConnections: 'Mulesoft Enterprise' - task: Maven@3 displayName: Test inputs: mavenPomFile: 'pom.xml' mavenOptions: '-Xmx3072m $(MAVEN_OPTS)' goals: 'test' publishJUnitResults: true testResultsFiles: '**/surefire-reports/TEST-*.xml' testRunTitle: 'Azure Pipeline Test Report' javaHomeOption: 'JDKVersion' mavenVersionOption: 'Default' mavenAuthenticateFeed: false effectivePomSkip: false sonarQubeRunAnalysis: false - task: Maven@3 
displayName: Deploy inputs: mavenPomFile: 'pom.xml' mavenOptions: '-Xmx3072m $(MAVEN_OPTS)' goals: 'deploy' options: '-DskipTests -DmuleDeploy -Danypointusername=$(ANYPOINT_USER) -Danypointpassword=$(ANYPOINT_PWD)' publishJUnitResults: false javaHomeOption: 'JDKVersion' mavenVersionOption: 'Default' mavenAuthenticateFeed: false effectivePomSkip: false sonarQubeRunAnalysis: false
Hello, how do I use Cisco AnyConnect on a Mac M3? It reminded me that I need security software.
Hi, it will work on Mac as well, but if you're using it on office premises you need to contact the IT team to get it installed.
@@vitechtalks6017 ok.thx👍
Thank you man you genius
Very useful. Please upload more project videos
Thanks for your feedback , Sure will do
If we do not apply security policies, do we need to add the environment client ID and client secret under properties?
Hi, no. If you want to apply security you need to add the properties and then API autodiscovery; only then will it work.
Hi Banrakas 😀
Please post few more recently asked interview questions if any pls
Hi , Sure will do
Have u posted recently asked questions in ur paid videos
Hello, Is there still anyway to get free MCD level 1 certification post migration to salesforce trailhead format?
Hi, As of now there is no option
Working
Hi sir, Can you please provide the requirement of gathering documents for the Atm transaction project?
Hi, can you please check the video description; I have given all the links there. Below is a sample one - github.com/vitschool92/Mule-Realtime-Project-Sessions/tree/class_notes
It was a very crystal-clear explanation of transactions.
Thank you 😊
Nice explanation
Thank u
Hi sir, when will the new class for MuleSoft start?
Hi, the new batch is going to start tomorrow; please reach me on 9972935359.
Sir, where were you from 2018? By now you would be at 1 million+ subscribers. So humble and easy teaching; so nice, sir, keep going.
Hi Thanks for your words 🤩
how to import data from mysql to snowflake
Can you please watch the entire playlist you may get some idea. czcams.com/video/Dx3-v-39hO0/video.html
@@vitechtalks6017 Sir, I'm literally stuck. I have exported a dump file from MySQL onto my system, and I need to load that dump file into my Snowflake account database. Please share any video link on that.
Refer to the below video from 40:00; the exact option can be seen at 47:05. czcams.com/video/Yn4hQlF1VF0/video.htmlsi=F1RwwlWFO5psP0KK
What can I do if I forget my password in MySQL Workbench?
Hi, we can reset it; there is a process to change the password, otherwise you need to uninstall and re-install it. It's easier if you keep the password the same as the user name, root; that way it is easy to remember.
100% recommended
Thanks for uploading. Sir, could you share the document?
Sure will upload soon ..
Email is not going out. "Error while sending email: 530-5.7.0 Must issue a STARTTLS command first." This error is coming.
Yes, can you please check whether you have enabled STARTTLS as true?
Thanks for updating.
I think the best way is to sort both files through an external-sort JCL, the same way as you explained, and instead of IF it is better to use EVALUATE.
After searching through several pages, thanks to you I managed to install Cisco, thank you 😊
When will more videos be uploaded?
This weekend the actual uploads will start.
Kindly do zero-to-advanced videos, or if you are providing any personal training please let me know.
Yes, we are providing personal training; this one is a demo session.
informative session
tq
Thank you, make the video alternate days.... please
sure
Great work vitechtalks please make more integration real time use cases, and please do the phase 3 of bfs project. Many thanks for your content
On the source side we put the On Table Row connector, select a watermark, and process the data.
2nd function was absolute but you gave an example of ASC
Thanks for sharing. It seems they changed the format on the MuleSoft training website; is there no way to get the free voucher?
Yes, they have changed the process. Here is the new link for certification; as of now it's not free, we have to pay and attend trainings before taking the exam. trailheadacademy.salesforce.com/products/mulesoft#f-products=Mulesoft?dispatch=show&courseType=certification&id=e75ecbfd-97b3-11ea-9f48-0cc47adeb5f8
I am very impressed by your videos and am following them all. I have a doubt: where do we configure the Anypoint Platform here? Without providing any Anypoint Platform configuration, how is it deploying to Runtime Manager?
Hi, the configuration is available in pom.xml, and we create a job in Jenkins. I have given the GitHub link in the video description; from there you can copy it all.
After adding the security group I am getting the same error. Please respond.
Can you please tell me what error you are getting?
Ping failed: the ping attempt timed out. Please verify that you have specified a correct host (server) name and port number. But I have provided everything correctly.
@@AYAAN854 Have you added inbound rules? If not, add a custom TCP rule; watch from 11:29 and add it.
After adding, it's still the same issue.
It's not available now, not even the free video trainings. Does anyone have more details?
Yes, we do have free video training. Please go through the below link: trailheadacademy.salesforce.com/products/mulesoft#f-products=Mulesoft?dispatch=show&courseType=certification&id=e75ecbfd-97b3-11ea-9f48-0cc47adeb5f8
Good work👏
Since we are past May 6th, please make a video about how to get it for free.
Sure; meanwhile you can go through the below link: trailheadacademy.salesforce.com/products/mulesoft#f-products=Mulesoft?dispatch=show&courseType=certification&id=e75ecbfd-97b3-11ea-9f48-0cc47adeb5f8
@@vitechtalks6017 Thanks, the format changed a little. Do we go through DEX401 and finish it, and then we can get the voucher?
Where do we set the ACK mode? I am a bit confused about it.
Hi, at the end you need to use the ack connector; in these videos we are not using it. You can implement it and raise an error for some records to test.
I am working on a similar job. I am stuck on how to update data in the target (SF) when there is an update in the source system; I want to perform this update task in scheduled jobs.
Hi, similar job means what is your source system? If it's AWS you can use On New Object; the same video is going to help.
@@vitechtalks6017 The source is SQL (Snowflake), with fields for status success and pending. It goes to SF, where two objects are created, one for success and one for pending. Now I want to create a real-time service where, if the status changes in SQL from pending to success, it also updates the SF objects.
For Snowflake ❄️ we have the On New Row / On Update connector; it will trigger whenever there is a new or updated row. Otherwise you need to use the watermarking concept. I have created a MuleSoft real-time use cases playlist that may help you.
While I am trying, the top 10 records work, but when you are using a limit and offset query, what Snowflake input parameters do you maintain? That one point is missing in this video. Can you tell me the input parameters?
Use the below SQL and pass two input query params, limit and offset:
%dw 2.0
output application/json
---
p('sql.to.be.used') default '' ++ " limit " ++ attributes.queryParams.limit default '' ++ " offset " ++ attributes.queryParams.offset
Hi, could you also make a video on an external logging mechanism like Splunk, and how it's used in a real-time scenario?
Awesome!!! Thank You so Much. It worked.
Glad it helped!
Great work 👏
Thank u
I had one question: since it's a schedule-based API and data is fetched per schedule, suppose in one scheduled job you fetched and migrated 100 records; how will my API work on the next 100, given that in SQL we are not deleting anything? How will this migration proceed for records 101 to 200?
Yes, good question. That is the reason we use the watermarking concept, to fetch a set of records each time without duplicates; check solution 4 where I have explained it.
Can I know the first question's answer? Is it 'AWS Secrets Manager Properties Provider'?
If you keep it in AWS Secrets Manager, you need to fetch from AWS on every hit, right? Yes, we can use it, but there is another way too... I'm waiting for people's responses; I will create a video soon for this.