Netwoven Inc
Webinar: Copilot and Generative AI in the Workplace
Want to enhance your current applications with AI and Copilot? Watch our webinar to discover how you can harness the power of AI to increase efficiency with intelligent automation.
254 views

Video

Govern your data across your entire data estate using Microsoft Purview and Fabric
416 views · 1 month ago
Are you struggling to juggle data security, governance, and compliance in today's ever-evolving regulatory landscape? You're not alone. Watch our webinar, "Govern your data across your entire data estate using Microsoft Purview and Fabric," where we'll explore proven strategies to effectively manage your data while ensuring it's secure, reliable, and compliant.
Data Science for Business with Microsoft Fabric
144 views · 2 months ago
In today's fast-paced business landscape, leveraging the power of data science has become indispensable for staying competitive and driving growth. This webinar aims to shed light on how harnessing the potential of data science can revolutionize your business strategies and propel your organization towards unprecedented success. Learn more: netwoven.com/events/webinar-data-science-for-business-...
Protect your organization by staying compliant using Microsoft Purview
110 views · 2 months ago
In this webinar, you'll learn: - Regulatory landscape and importance of compliance - Common compliance challenges - Understanding Microsoft Purview for Compliance - Implementing and using Microsoft Purview for compliance
Webinar: Advanced Deep Dive Demos with Microsoft Fabric
406 views · 3 months ago
In this webinar, you will: - Gain hands-on experience with Microsoft Fabric. - Explore real-world applications and use cases. - Learn best practices and tips for maximizing the benefits of this powerful tool. - Access the free ebook "Treat Your Data as a Product with Microsoft Fabric". Get the ebook here: netwoven.com/services/data-and-ai/data-engineering/data-as-a-product/ebook/
Webinar: Migrate Traditional Data Warehouses to Fabric Modern Warehouse and Serverless SQL
308 views · 3 months ago
In this webinar, you'll learn: - How to Modernize Your Data Warehouse - Fabric's Cutting-Edge Capabilities - Serverless SQL for Scalability - Best Practices for Smooth Migration - Real-world Case Studies - Fabric Copilot features
Planning of Microsoft 365 Copilot Rollout Strategy
8K views · 3 months ago
In this webinar, you'll learn how to: - Get your organization ready for Microsoft Copilot: get your information ready for search - Implement the right information access controls and policies established in your organization so that your users will have access to the information that they need and nothing else - Enable automatic data classification controls across your M365 workloads (Teams, Sh...
Identify Insider Risks and Prevent Data Exfiltration with Microsoft Purview
180 views · 5 months ago
Download the ebook "4 Steps to a Bulletproof Insider Risk Management Strategy" here: netwoven.com/services/cloud-infrastructure-and-security/microsoft-purview-services/ebook-insider-risk-management/
Elevating Data Security in Law Firms with Govern 365
76 views · 5 months ago
In this webinar, we’ll cover how you can:  - Ensure strong document protection - Securely collaborate on sensitive content in your repository - Manage permissions and remove access for a file, even if it has been downloaded - Prevent screenshots and screen sharing - Block copying, printing of files, and much more - Derive maximum value from all your Microsoft Security products Visit here for mo...
Webinar: Build the Ultimate Intranet and Employee Experience with Microsoft 365 Viva and Search
95 views · 6 months ago
In this webinar, you will learn how to: - Create personalized experiences in SharePoint Online, Microsoft Teams & Viva - Surface your modern intranet in Microsoft Teams - Integrate 3rd party application data in your intranet - Configure Microsoft Search to deliver integrated search results from M365, 3rd party applications, and the web - Create custom extensions to truly unleash the capabilitie...
Webinar: Discover, Protect, and Manage Your Sensitive Data Using Microsoft Purview
290 views · 6 months ago
In this Microsoft Purview webinar, part 1, you will learn about: - Understanding your data security landscape - Strategies for Discovering Sensitive Data - Planning your data protection initiative - How to use Microsoft Purview to implement and maximize your data protection objectives - How to measure the success of your data protection initiatives - How to enable your organization to successfull...
Webinar: Data Observability with Microsoft Fabric
217 views · 7 months ago
In this webinar, you will: - Compare ways to implement Data Observability features with Microsoft Synapse Analytics vs Fabric - Understand the built-in Data Observability features in Microsoft Fabric for better ROI - Learn from Microsoft experts about their new Fabric Implementation Guidelines - Get your questions answered by the Netwoven & Microsoft industry expert panel - Access to a F...
Build a Modern Intranet and Employee Experience in Microsoft 365
167 views · 8 months ago
Build a best-in-class Microsoft 365 intranet and digital employee experience for your workforce. Visit here for more information netwoven.com/solutions/business-applications/modern-intranet/
Webinar: The Roadmap to Security Modernization
135 views · 8 months ago
Gain valuable insights and expert guidance on securing your business and mitigating cybersecurity risks using the Microsoft security suite. Learn more here netwoven.com/solutions/security-modernization/ In this webinar, you will learn about: - Securing identities, access, and endpoints - Device provisioning, information protection, compliance and governance - Data loss prevention, eDiscovery, r...
Webinar Power BI: More Than Just Visuals
202 views · 9 months ago
Lost in the intricacies of #PowerBI data management and reporting processes? Watch our webinar Power BI: More Than Just Visuals to gain valuable insights and expert guidance on analyzing and understanding your data using Power BI. You’ll learn: - New ways to visualize your data using the latest Power BI features - Common reporting solutions for finance and revenue - AI Insights in Power BI - Ho...
Protect Sensitive Information Using Microsoft Purview + Govern 365
122 views · 10 months ago
Deep Dive into Microsoft Fabric and OneLake with Advanced Power BI Reports
276 views · 10 months ago
The Roadmap to Security Modernization
123 views · 10 months ago
Migrate Application Authentication from Okta to Azure AD (Microsoft Entra ID)
2.1K views · 2 years ago
How to Protect Sensitive Content in M365 using Govern 365
126 views · 2 years ago
Creating Virtual Data Rooms Within Your Microsoft 365 Tenant Using Govern 365
7K views · 2 years ago
Govern 365 - The Zero Trust Protection, Compliance, and Experience Solution for Microsoft 365
528 views · 2 years ago
Microsoft Power BI - Introduction to Power BI Service
166 views · 2 years ago
Microsoft Power BI - Desktop Demo
379 views · 2 years ago
Microsoft Power BI - An Overview
404 views · 2 years ago
Azure AD - Extending AAD Risk Anomaly with Power Automate
322 views · 3 years ago
Azure AD - Switch ServiceNow from Okta to Azure AD
1.5K views · 3 years ago
Azure AD - Setup Azure AD Terms of Use
716 views · 3 years ago
Azure AD - Break Glass Account for Microsoft 365
3.8K views · 3 years ago
Matthew Maher completes his 15 years with Netwoven
302 views · 3 years ago

Comments

  • @richardcollins9862 · 2 months ago

    Excellent presentation

  • @georgewashington3012 · 3 months ago

    Are sensitivity labels truly required for Copilot for M365? Labels have their own issues (compatibility, performance, end user training, etc.). I’m hoping I can simply get by with double checking permissions everywhere prior to rollout as label deployment would be a cultural shift and an even bigger headache than deploying Copilot. Besides, what would be the benefit in deploying labels as it relates to Copilot? We’d want people to feel free to use ALL sensitivities of data with Copilot for M365. After all, one reason it’s so expensive is the robust privacy and security controls built into Copilot for M365. Restricting the types of data users can use with Copilot would only serve to frustrate users and create angry calls to the Service Desk. Same thing for DLP. I realize DLP is useful in its own right, but what relevance does it have for Copilot? It isn’t as though users will exfiltrate data to external storage services/devices/recipients to a higher degree simply because they have Copilot for M365. Employees are expected to exercise good judgement when sending data and Copilot for M365 doesn’t change that.

    • @netwoven · 3 months ago

      Sensitivity labels are not required to use Copilot for M365. We are recommending, however, that you take steps to prevent sensitive or confidential information from being accessed and incorporated into Copilot results. Sensitivity labels are one way to do this, but there are other approaches you can take as well. Double-checking access permissions is a good idea. We would also suggest monitoring access logs for selected content, to confirm that Copilot is not inadvertently accessing content it should not see.

  • @sudhanshukaushik9980 · 7 months ago

    Wish I could give you 10 likes myself. Great stuff! Thanks!

  • @richardwaldron1684 · 9 months ago

    Great explanation and demo, thanks for posting.

  • @aureliaauma2552 · 1 year ago

    Kindly send me a link so that I can download Dynamics 365.

  • @coolbeing163 · 1 year ago

    Thanks! Can we do some manipulation of the data while importing? E.g. if my Student ID is "XXX-BAC-StudentID-123" and so on... I want to extract ONLY StudentID-123 and discard the prefixes. Can we do that?
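For anyone with the same import question: whatever the import tool, the transformation itself is simple string handling. A minimal sketch in Python (the ID layout is assumed from the example in the comment above; adjust the split logic for other formats):

```python
def extract_student_id(raw: str, keep_parts: int = 2) -> str:
    """Split 'XXX-BAC-StudentID-123' on '-' and keep the trailing tokens.

    Assumes the ID is always the last two hyphen-separated tokens
    (e.g. 'StudentID' and '123'); adjust keep_parts for other layouts.
    """
    parts = raw.split("-")
    return "-".join(parts[-keep_parts:])

print(extract_student_id("XXX-BAC-StudentID-123"))  # StudentID-123
```

If the number of prefixes varies, counting from the right (as above) or a pattern match such as `r"StudentID-\d+$"` is more robust than counting prefixes from the left.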

  • @emergentform1188 · 1 year ago

    As a developer who has been doing complex reporting for a variety of industries for 25 years, I can assure everyone of this: these incredibly weak measures offer very little in the way of reporting capacity that a company would actually need. SSRS and Power BI are great tools for sure, but the problem is that Microsoft does not allow access to cloud-hosted production data. Power BI can connect to the "data warehouse", but that warehouse is incredibly useless and doesn't even work because it's just too slow. In short, Microsoft is SCAMMING you by telling you that it's possible to use Power BI to generate reports (if your D365 database is cloud hosted by Microsoft). And the SSRS option is only available to developers with the knowledge and access to be able to customize D365, and that will cost a fortune. Microsoft has set up this scam to trick customers into believing that reporting is "easy", but then customers only discover after implementation, when it's too late, that it was all lies. If you want meaningful reporting you need direct access to the database, or a copy of the data somewhere where you can access it and write database code to do the fetching and transformations. These rinky-dink methods touted by Microsoft are like bringing a bicycle to a Formula 1 race. Microsoft knows this. They want to sell report development services so they and their partners have a permanent cash cow. This bait and switch has understandably ticked a LOT of companies off, and so Microsoft will apparently eventually have a way to push D365 to a separate Azure data lake, where it will be accessible for proper reporting, but that's going to come at a price of course; those tools aren't cheap. Sorry to break the bad news to you all. I know this because I've been dealing with the fall-out of Microsoft's lies for the last 3 years, and found a way to work around their numerous roadblocks to generate useful reports.
A word of advice to companies thinking of implementing D365: don't. There are far better options.

    • @blackfincloudservices2844 · 3 months ago

      Ahh ... where to begin? First, with empathy. It is true that some reporting choices for Dynamics are a bit daunting and somewhat limited. SSRS is a good example - while it's certainly possible for SOME end-users to take a course to learn how to develop SSRS reports, it's just not realistic that this could be a widespread option for the end-user community. As well, SSRS against the online version of Dynamics does not support SQL queries, so FetchXML must be used, which is also not exactly what we might call intuitive. I will also admit that the performance of some reporting choices is not ideal against big data sets with this platform. In the case of large data sets it would be a good idea to report against a separate source. It's possible to use something like Azure Data Factory to periodically copy sets of data for the use of reporting - and it's not break-the-bank expensive. And while this concept would have been quite expensive in the past, involving a lot of development work, the more graphical tools such as ADF have made the integration process much simpler than it was. Having said that, there are some serious inaccuracies in the polemic above. For instance, it is absolutely possible to connect Power BI to the production version of Microsoft Dynamics Customer Engagement, now known as the Dataverse. Just open either the cloud version of Power BI or the Desktop version and click Get Data - the ways in which you can connect to that data source are well documented - and honestly, by no means rocket science. The idea of the "data warehouse" being the only thing you can connect to related to production Dynamics - I honestly don't know what you mean by that. As you well know, Power BI can connect to many sources - a given "data warehouse" being just one. Also, as the video explains - what defines reporting?
      In the 17 years that I have been involved with consulting on this platform, most of the time, the most popular "reports" with many end-users are two-dimensional representations which are essentially Excel spreadsheets. And the argument that if you want meaningful reporting you need to export the data to Azure Data Lake and attack it with "meaningful" reporting tools - well, now you sound like the developer that you are. So I'm not sure it's appropriate advice to end-users or business people who need to make a business decision as to whether or not a system will work for them. Moreover, it is absolutely TRUE that many end-users of Microsoft Dynamics make extensive use of the native charts and graphs against Advanced Find views to produce perfectly fine interactive dashboards that provide the ability to drill down and answer questions. There are many people who consider those to be reports. What's more, you can now take any view you have created or that the system has presented and simply click "Export to Excel Online", work on that data in Excel Online, click Save, and it writes it back to the database. This is an operational advantage that many systems simply do not have. People love Excel. Sorry, but I'm not aware of many other CRM systems where this is possible. So it's not all "lies, lies, lies" from Microsoft. I spent 10 years in the open source world, and I still log into Linux on a weekly basis for some clients, but I don't find the need to bash Microsoft and call them "liars." And I have used other CRM systems - some are absolutely great and well-suited to a vertical market. But there is a lot to like with this system - as well as the low-code, and affordably-priced, Power Platform, which is essentially Dynamics with specific IP stripped away and presented as a starting point to quickly develop business applications. But I digress ... back to reporting.
      Power BI is, of course, the choice for more sophisticated reports, and while I agree that there is a learning curve to create Power BI reports - I personally know plenty of end-users who have an analytical bent and have learned to create Power BI reports for themselves against the data in Dynamics. As well, it's pretty trivial to embed a Power BI report into Microsoft Dynamics as an embedded dashboard. Every single system has benefits and weaknesses. Microsoft Dynamics 365 has plenty of benefits, and plenty of weaknesses. This is true for all CRM systems. Don't even get me started on Salesforce -- which is madly loved by millions. At the end of the day, none of them should be immediately and completely dismissed. They should be carefully evaluated to determine for your given business whether the benefits outweigh the weaknesses. Take the time to learn, and decide for yourself, for your own business. It's worth the effort to do so.

    • @emergentform1188 · 3 months ago

      @@blackfincloudservices2844 Wow, I very much appreciate this long and detailed comment. Let me give you some background on where I'm coming from. 4 years ago I joined a company already using D365 F&O, cloud hosted by MS. During the installation of D365 they (the Finance staff, with no real report dev experience) were told that they could connect directly to D365 data from PBI and generate their own reports. They were also told this was "easy". That's not a lie, it's 2 lies. I had determined, and confirmed twice with Microsoft support on 2 separate occasions (2 tickets), that PBI can only connect to the data warehouse (which is largely useless for serious reporting anyway), and the connection is too slow to even work (except for tiny tables with no more than a few hundred records). Furthermore, even if that did work as promised (which was even documented in Microsoft's sales materials), the end users lack the skills to do anything meaningful with it. Plus the data warehouse is far too limited for the vast majority of the reporting they wanted anyway (and I also confirmed this directly with Microsoft support - I was pretty relentless in my hounding of them about this, actually, and was put in direct contact with technical guys who know what's up). I also inquired and pressed our Microsoft partner, a company certified by Microsoft to do D365 installations and support, to please help me in accessing the D365 database for the purposes of reporting. They didn't know how and couldn't provide an answer either. They just sent me random BS Microsoft articles with information I had already tried and confirmed directly with Microsoft doesn't actually work. So I asked them to open a ticket with Microsoft to inquire as well, and Microsoft's answer yet again was simply: that's not possible. They said that if we were hosting the database locally then the PBI connection to the data warehouse would work, but since it's cloud hosted, not a chance.
      Not that the data warehouse would have been much use to us anyway, but at the time we had nothing, so it would've been better than nothing at least. So as you are likely aware, this lack of data access has seriously ticked off numerous organizations all over the world. Where does Microsoft get off thinking they have the right to sequester a company's own data from them? Especially when they are being paid so much for the service and use of the product. It literally boggles my mind, and if I were supreme ruler of earth I would make that illegal. A company's data is its most precious resource, and for companies to restrict semi-real-time access to it is nothing short of crippling and unacceptable. But so many software companies do it, and they get away with it because most often the people making the decision to adopt a piece of cloud-hosted software don't know to ask the right questions and demand proof of the claims being made around data access. And once the software is installed, a year and millions of dollars later, it's too late to back out and now you're at their mercy. That is the game being played here, I'm quite sure of that. The company I currently work for has been burned repeatedly by this same scam. So with so many disgruntled and misled organizations out there, Microsoft was promising they would FINALLY provide a means to kick out the data. They released that early 2023, I believe it was. It's a mechanism that pushes full actual tables (not that mostly useless data warehouse nonsense) to an Azure data lake (where it's held in csv files). I set that up and tested it and it works great, hooray! Finally we have a way forward. Then oops, not so fast... The data lands in csv files with no column names. The column names are actually held in separate json files, and Microsoft advises using Azure Synapse to mesh them together and create pipelines to push the data to an SQL database or whatever you want to do with it.
      Before going further, side note, let me just point out what an incredibly insane amount of work that is. My current reporting uses more than 70 tables, and that's a steadily growing list. That's a LOT of pipelines to set up, schedule, and maintain. Lots of opportunity for things to go wrong too, and you have to set up all the datatype transformations since the data is all text in the csv files. Meanwhile, we've had database replication technology for well over 25 years now. What Microsoft is suggesting here is absolutely archaic compared to database replication. Basically, the data lake method completely disassembles the tables into their constituent parts and requires you to reconstruct them yourself, just to make a copy of the data in a separate database so you can finally access it (and apply your own indices too, of course). What a huge step backwards from database replication. But hey, we're desperate for something close to real-time data access, and Microsoft has us over a barrel, so we gotta do it, I guess. So I went ahead and did that, and it turns out it doesn't even work. The specific problem is that Azure Synapse (or any other pipeline technology in Azure) isn't able to interpret the date format being kicked out of D365 into the csv files. The dates all come out as nulls. I confirmed this with Microsoft support. However, there is a way to still get that csv data into a pipeline, BUT then you lose the column names. Microsoft's recommendation was to do that and then manually assign the column names in the pipeline. Not only does that add an insane amount of additional work to what is already a ridiculously labor-intensive and unnecessary development effort, but I'm not sure I'd even trust it, because column names would need to be assigned based on their sequence. If the columns are ever re-ordered or if a new column is inserted, then that ETL pipeline breaks and it's a bit of a nightmare to track down and fix.
      I've learned the hard way from many years of experience not to rely on the order of columns in a provided dataset. I just revisited this about 6 weeks ago and lo and behold, Microsoft still has not fixed the date issue. Meanwhile their long-promised data lake "solution" to the long-standing data access issue has been in production for ages now and it still doesn't even work. So to recap, Microsoft appeared to be capitulating to public demand to provide data access, but they seem to have pulled another ruse with this archaic data lake solution which remains broken to this day. Did they ever actually intend to provide that access? Or was this just another false promise, leaving many companies with no other option but to get Microsoft or their partner$ to build reports for them with the D365 SSRS methodology (which is also itself extremely archaic and labor intensive compared to using standard SQL Server and PBI)? At this point I doubt Microsoft ever intends to provide direct or near real-time production data access to D365. Why would they? They have a cash cow going by restricting data access. Dismantling the tables and pushing them out as csv's (with column names held in a separate json file) seems to be all they are willing to do, and then it's up to the companies to undertake the substantial development effort to reconstruct the tables from that. What a disaster. Also, I looked into the Dataverse thing and it's an absolute joke. There's far, far too little data available through that for us to do anything with. The vast majority of the tables I need aren't even showing. I'm not even sure what it's supposed to be, actually, or what the point of it is, because it's just so limited. So again I opened a ticket with Microsoft to inquire and they also confirmed what I had observed. It's just not usable for serious reporting. I'm not even sure why it exists at all. I've been at this a long time now.
      I've spoken to numerous people online facing the same problems due to Microsoft's lies and roadblocks. As of yet I still have not encountered a viable solution to this problem. So I'm left kicking out the database and restoring it locally on my PC and then generating the reports from that. Given that the data lake solution only partly works, and the Azure Synapse pipeline methodology (without column names) just seems too cumbersome, potentially unreliable, and labor intensive to be a viable option, I'm thinking about writing my own ETL application in .NET to do what Azure Synapse is, as of yet, incapable of doing: take the data from the data lake csv files, mesh it with the column names in the json files, and reconstruct the database tables in our own Azure SQL instance. That seems to be the only viable option, and it would probably be less work than setting up all those Azure pipelines. Oh, and another option is tapping directly into the D365 API and pulling the data out that way, but I'm not sure how viable that is for extracting tables with over a million records. Since I have the data available in csv's in Azure blob storage, I'm more inclined to just use that. However, having said all this, if you feel I'm off the mark about anything I've said here, please alert me to the error of my ways. I would LOVE to be wrong and for there to truly be a way to access D365 data in close to real time without embarking on the major custom development effort mentioned (and not just the relatively useless data warehouse). To any non-techies reading this, like managers or finance staff thinking of adopting D365, here's my advice: don't. Not only does Microsoft lie egregiously and repeatedly about the data access for reporting, the system itself is generally disliked by everyone in my company. They much preferred Navision, actually. I can say with absolute certainty that no one I work with likes it. They find it restrictive and cumbersome, and supporting it costs a fortune too.
You can do better.
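On the headerless-CSV-plus-JSON-metadata layout described in the thread above: the meshing step itself is small once both files are in hand. A minimal Python sketch using inline stand-ins for the two exported files (the table contents and column names here are invented for illustration; the real D365 export is far larger, and the dates need their own type handling, as the commenter notes):

```python
import csv
import io
import json

# Hypothetical stand-ins for one exported table: a headerless CSV file
# and a JSON metadata document listing that table's column names in order.
csv_text = "1001,Contoso,2024-01-15\n1002,Fabrikam,2024-02-20\n"
meta_json = '{"columns": ["AccountId", "AccountName", "CreatedOn"]}'

# Mesh: zip each headerless row with the ordered column names.
column_names = json.loads(meta_json)["columns"]
rows = [dict(zip(column_names, row)) for row in csv.reader(io.StringIO(csv_text))]

print(rows[0])  # {'AccountId': '1001', 'AccountName': 'Contoso', 'CreatedOn': '2024-01-15'}
```

From here the rows can be bulk-inserted into a SQL table; every value is still a string at this point, so datatype conversion (especially the problematic dates) remains a separate step.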

  • @octaviansfetcu4458 · 1 year ago

    Great video. Thanks for sharing! One question: how/where do you save/read the change token? Thanks.

  • @Radhe_kanha0 · 1 year ago

    I need to learn more about Nintex, can you please help me out?

  • @juliovilaca2120 · 1 year ago

    Hello, I just created the same flow to generate the Req Number, but when I come back to the form, it appears as "required" anyway, not allowing me to submit the form. Please help, thanks a lot!

    • @VincePangan · 1 year ago

      Hi Julio. In your "Requests" list, change the Title field to NOT be required. You may have to update your Nintex form again as well. Let me know if this helps!

    • @juliovilaca2120 · 1 year ago

      @@VincePangan great! It worked well. Thank you

  • @kannibala1 · 1 year ago

    Hi, I have used the Node.js CLI to add a webhook URI to a SharePoint list/site. Is there any way to automate it? Like, any PowerShell scripts to add a webhook URL to SP?
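On the scripting question: creating the subscription is a single REST POST to the list's `subscriptions` endpoint, so PowerShell (e.g. `Invoke-RestMethod` with an access token) or any other language can automate it. A Python sketch of how the request might be constructed (the site URL, list GUID, and callback URL below are placeholders, and the required OAuth bearer token plus the actual POST are omitted):

```python
import json
from datetime import datetime, timedelta, timezone

# Placeholder identifiers -- substitute your own site, list, and endpoint.
site_url = "https://contoso.sharepoint.com/sites/demo"
list_id = "11111111-2222-3333-4444-555555555555"
notification_url = "https://example.com/api/sp-webhook"  # must be a public HTTPS endpoint

# SharePoint webhook subscriptions are limited to roughly six months.
expiration = (datetime.now(timezone.utc) + timedelta(days=180)).isoformat()

endpoint = f"{site_url}/_api/web/lists(guid'{list_id}')/subscriptions"
payload = json.dumps({
    "resource": f"{site_url}/_api/web/lists(guid'{list_id}')",
    "notificationUrl": notification_url,
    "expirationDateTime": expiration,
    "clientState": "any-opaque-string",
})
# A real call would POST `payload` to `endpoint` with Accept/Content-Type set
# to application/json and an Authorization: Bearer <token> header.
print(endpoint)
```

Renewing before expiry is a similar request against the created subscription's ID with a new expirationDateTime.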

  • @James-sc1lz · 1 year ago

    How do you make sure it never expires?

    • @netwoven · 1 year ago

      Cloud-only accounts cannot be set to expire, so that is not an issue. Many people are actually asking for Microsoft to implement a way to expire AAD accounts. feedback.azure.com/d365community/idea/5d44d790-c525-ec11-b6e6-000d3a4f0789

  • @Quincypatty · 1 year ago

    Thank you for the video. However, I think I did something wrong and I don't know what. You showed how to edit the task form when configuring the assigned flexible task. If I generate a preview or publish it, the original form stays in edit mode. How can I change this to read-only?

    • @netwoven · 1 year ago

      Are you asking about the form for the SharePoint list item? Or the task form? And how is the user going to access that form? If you are referring to the SharePoint list item form, then you should send a link to the view item (viewitem.aspx). Please email info(at)netwoven.com for more information

  • @leannefleming716 · 1 year ago

    Thank you this was fantastic, very clearly presented and gives a good overview of the various types of reports

    • @netwoven · 1 year ago

      Thanks Leanne. Here are some additional resources you may also like www.netwoven.com/category/modern-applications/dynamics-365/

  • @inatovrustam · 2 years ago

    Thank you. Good educational material.

  • @sawramdhavi2037 · 2 years ago

    In my Sales Hub display, there are no report tools. How do I bring them up?

  • @pasumarthiashik1099 · 2 years ago

    Hello sir, can you provide your LinkedIn ID so I can connect with you regarding Nintex?

    • @netwoven · 2 years ago

      Here's our LinkedIn id www.linkedin.com/company/netwoven-inc-/mycompany/?viewAsMember=true also, you can reach out to us at info@netwoven.com for more information

  • @browngentle1 · 2 years ago

    I have the same question as defiant1024. How to get it working with MFA enabled environments?

    • @tamboleo · 1 year ago

      Did you manage to work around this?

  • @pravinwankhade1835 · 2 years ago

    How do we create our own endpoint URL... please guide?

  • @alexisjesus1635 · 2 years ago

    Hello, I need help comparing two different dates with Run If.

  • @Catonkey1 · 2 years ago

    Hi, where's the link to setting up Diagnostic Settings?

  • @muzzamilazam · 2 years ago

    Thanks for making this video. It has helped me a lot as a newbie to Dynamics 365 CE.

  • @jlewis6698 · 2 years ago

    Great video. What about excluding the account from any MFA?

    • @netwoven · 2 years ago

      Great question, we recommend excluding it from any conditional access policies including ones that enforce MFA in case there are any AAD MFA outages

    • @James-sc1lz · 1 year ago

      @@netwoven Also exclude AD Sync accounts if you are using it.

    • @valavanchandran8573 · 1 year ago

      @@James-sc1lz Never ADSync

  • @touchtbo
    @touchtbo 2 years ago

    Aren't Workflows deprecated on SharePoint?

    • @ahmadganteng7435
      @ahmadganteng7435 2 months ago

      SharePoint 2013 workflows will be turned off for new tenants as of April 2, 2024. They will be removed from existing tenants and fully retired as of April 2, 2026.

  • @mohammedajeddig6381
    @mohammedajeddig6381 3 years ago

    Hey Netwoven. Thanks for the tutorial. Have you faced any problems when trying to open Nintex Workflow on the Request List? For me it doesn't even load up. Is there any configuration I should make first? Thanks in advance.

    • @netwoven
      @netwoven 3 years ago

      Hey Mohammed, We would love to help you answer any questions you may have. Firstly, have you been able to open Nintex Workflow on any other lists? Secondly, do you see the button for Nintex Workflow?

  • @aniruddhamukherjee5231

    Did anyone say "Steering the Ship"? Well, I can say for sure that this ship has now become a hi-tech aircraft carrier (ready to hit any target), after starting its journey as a paddle boat around two decades back! Congrats Matt for staying on this 'boat to ship' journey of Netwoven and helping all of us with your extraordinary technical contributions all through these years.

  • @sukantaFun
    @sukantaFun 3 years ago

    A big congratulations, Matt!

  • @c016smith52
    @c016smith52 3 years ago

    Awesome, thanks so much for this. Very timely too, as I was just researching and trying to do this today. I was able to get it working already, thanks!

    • @netwoven
      @netwoven 3 years ago

      Glad this video helped! Please let us know if you have any other questions by emailing us at info@netwoven.com or if you would like to see any other topics covered in our channel.

  • @UmeshBeti
    @UmeshBeti 3 years ago

    @Netwoven, how do you subscribe to a webhook and access a document library? I am trying to get notified whenever folders and files are deleted!

    • @netwoven
      @netwoven 3 years ago

      Umesh, webhooks allow you to be notified of item-delete events. When you receive the notification and query the change log, the item in the event log should have DeleteObject = true. Supported events: docs.microsoft.com/en-us/sharepoint/dev/apis/webhooks/lists/overview-sharepoint-list-webhooks#list-event-types Change log delete item: docs.microsoft.com/en-us/sharepoint/dev/solution-guidance/query-sharepoint-change-log-with-changequery-and-changetoken#use-the-corelistitemchangemonitor-add-in
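
      The delete-detection step described in the reply above can be sketched in a few lines. This is a minimal sketch, assuming the change-log response has already been fetched and parsed to JSON; the `ItemId`/`DeleteObject` field names mirror the reply and should be verified against the linked Microsoft docs.

```python
# Minimal sketch: after a webhook notification arrives, query the
# SharePoint change log and keep only the entries flagged as deletions.
# The payload shape below is illustrative (hypothetical data), not an
# exact SharePoint response.

def deleted_item_ids(changes):
    """Return the ItemIds of change-log entries where DeleteObject is true."""
    return [c["ItemId"] for c in changes if c.get("DeleteObject") is True]

changes = [
    {"ItemId": 17, "DeleteObject": True},   # a deleted file or folder
    {"ItemId": 18, "DeleteObject": False},  # an edit, not a delete
]
print(deleted_item_ids(changes))  # → [17]
```

      In practice you would run this filter inside the endpoint that receives the webhook notification, after calling the list's GetChanges API with a change token.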

  • @anjanakrishnaveni611
    @anjanakrishnaveni611 3 years ago

    Fantastic video! Good job

  • @anjanakrishnaveni611
    @anjanakrishnaveni611 3 years ago

    Amazing video, very helpful! p.s. love your voice

  • @christophernowak3338
    @christophernowak3338 4 years ago

    Was hoping for a tutorial, not an ad....

  • @guruprasadmarathe
    @guruprasadmarathe 4 years ago

    Can I get the PPT?

    • @netwoven
      @netwoven 4 years ago

      Yes. Can you please drop an email at info@netwoven.com?

  • @oivvv9218
    @oivvv9218 4 years ago

    good one Alex :)

    • @netwoven
      @netwoven 4 years ago

      Thanks Oivv. Is there anything we can help with?

  • @chanm1000
    @chanm1000 6 years ago

    With the approvals, will the approval outcomes be retained? I am noticing that workflows seem to hide this information from view after 90 days or so. I would like approvals to be auditable.

    • @SANTACRUZDRONES
      @SANTACRUZDRONES 6 years ago

      Yes, you can retain all the approvals and make them auditable. Please go to Netwoven.com and contact me, Alex Viera; I'll be happy to do a demo of these capabilities.

  • @patenik2
    @patenik2 6 years ago

    Awesome info and demo. Much better than PnP video from Microsoft.

  • @Isha3006
    @Isha3006 7 years ago

    Thanks for the awesome video. Can we send the notification to localhost instead of the Azure website in your case? It is not working on my end when I do that. Could you please share your code so I can see what is going wrong on my end?

    • @webdeveloperify
      @webdeveloperify 4 years ago

      You can use ngrok, which makes your local PC reachable on the internet over both HTTP and HTTPS endpoints.
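
      As a concrete illustration of the ngrok suggestion above (a sketch only; it assumes ngrok is installed and that the local webhook receiver listens on port 5000, a hypothetical port):

```shell
# Expose a local webhook receiver to the internet.
# ngrok prints public http/https forwarding URLs; register the https one
# as the webhook notification endpoint in place of an Azure website.
ngrok http 5000
```

      Note that SharePoint webhooks require an HTTPS endpoint that answers the subscription validation request, which the forwarded https URL satisfies while the validation logic runs on your local machine.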