Channel: Visual BI Solutions

Launch your Favourite Netflix Genre with Power BI Custom Visuals


xViz Pro Suite Visuals are completely free with all the pro features unlocked for Power BI Desktop! Leverage the full potential of xViz Suite by downloading them here.

It is the quarantine period, and many of us are settling into the couch on weekends to watch movies and television series on Netflix. Sometimes it becomes cumbersome to spend hours of your free time scrolling through the menus in the Netflix sections, only to end up re-watching the same movies and shows again! The streaming service also provides suggestions based on our watch history, and we fail to discover hidden gems as we are led on by the recommendations in our watch list.

This interactive Power BI Dashboard helps you to identify all the different sub-genres and niche selections that are not visible explicitly inside Netflix.

 

CLICK HERE to view the Power BI Dashboard

 

The Enable External URL feature in the visual helps you access the Netflix genres directly from inside the Power BI dashboard. The steps mentioned below help you achieve it:

  1. Enable External URL Option in the xViz Multi Axes Visual
  2. Click on the Genre Label you wish to browse and confirm Navigation to Netflix
  3. Voila! You are now browsing the sub-categories present in the streaming service


Let’s now get an overview of the Power BI custom visual components that are used in the dashboard.

1. xViz Hierarchy Filter
The xViz Hierarchy Filter / Advanced Slicer in the Power BI dashboard is used to categorize all the genres in a consolidated format. The Context Menu feature in the slicer lets you expand and collapse all the data, and the topics can also be re-arranged in alphabetical order for easier navigation.

The Enhanced Search option is highly responsive, returning results from the hierarchies based on the search text. The descendant count is also displayed under each section of the hierarchy, giving an overall insight into the data present.

 

2. xViz Tag Cloud
The xViz Tag Cloud (or Word Cloud) Power BI Custom Visual gives you an intuitive way to represent textual data.

 

3. xViz Multi Axes Chart
The xViz Multi Axes Chart displays the codes for the multiple hidden genres present inside Netflix. The visual can be used to display data in various formats – such as Column, Line, Area, Spline, Lollipop, Stacked Area, and many more.

***

You can download the Advanced Visuals discussed above and all the Power BI Advanced Visuals of the xViz Pro Suite FREE for Power BI Desktop.

You can take a look at all the advanced visuals in the xViz Pro Suite here. This blog was originally published on the xViz website. Click here to visit and learn more about advanced visuals for Microsoft Power BI.

Subscribe to our Newsletter

The post Launch your Favourite Netflix Genre with Power BI Custom Visuals appeared first on Visual BI Solutions.


SAP Analytics Cloud – What’s New in version Q2 2020


This blog covers major features and enhancements in SAP Analytics Cloud Q2 2020 and links to detailed blogs on the new features. Let us look into some of the highlights of this release.

 

Key Updates

Here are some important updates:

  • The much sought-after scheduling and sharing of Stories / Analytic Applications is now possible from this release. The contents can be shared via email.
  • The much-awaited Android support is also released in this version.

Data Integration

  • A live connection from SAP Analytics Cloud to SAP Data Warehouse Cloud has been made possible in this release.
  • Data limits for acquired models are increased to a massive 100 million cells and 100 columns per load.
  • Users on a Cloud Foundry tenant who have an Open Connectors account can now use more Open Connectors data sources, such as Autotask CRM, Close, ConnectWise CRM, Infusionsoft CRM, Insightly, and Box.
  • Live Universe connections now support Search to Insights, which means you can connect to virtually any database, live, through a Universe and have Search to Insights work on it.
  • Removing duplicate rows in acquired data is now easy.


 

Visualization

  • You can create custom hierarchies in BW Live connections.
  • You will be able to create widget filters with dynamic time ranges based on a custom current date. However, this is still not supported on Geo Maps.
  • You can now export your stories to PowerPoint, saving time when preparing presentations.


 

  • Pattern search is improved in input controls, letting you select members based on a specific pattern rather than scrolling down the vertical bar.
  • The long-awaited custom sorts are available on the dimension members in tables and charts for the BW Live models.
  • Measure Values can now be displayed on Bubble Layer of Geo Maps.
  • Constant selection, a BEx Query feature that gives the flexibility to ignore conflicting filters for a selected subset, is now available in SAP Analytics Cloud when creating restricted measures, just like in the BEx Query Designer.

 

Analytical Designer

  • The OData services in Analytical Designer now have options to define services based on SAP systems such as SAP BW, HANA and BPC.
  • Exporting an analytic application now also exports the custom widgets used in the application.
  • The variables used in the application are merged by default. The scripting APIs removeVariableValue() and copyVariableValueFrom() are introduced for BW and HANA live connections; with them, you can unmerge the variables and set separate variable values for each widget.
  • Accidents happen: if you delete a widget by mistake, you have a few seconds to restore it.
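As a rough sketch of how these APIs could be used in an Analytics Designer script (the widget names Chart_1 and Table_1 and the variable name ZVAR_REGION are placeholders, and the exact signatures may differ from this illustration):

```javascript
// Remove a value from one widget's (unmerged) variable on a BW/HANA live model
Chart_1.getDataSource().removeVariableValue("ZVAR_REGION", "EMEA");

// Copy the variable value from another widget's data source
Table_1.getDataSource().copyVariableValueFrom(Chart_1.getDataSource(), "ZVAR_REGION");
```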

  • If you want specific tasks to be done before executing a data action trigger or BPC sequence, the onBeforeExecute event is now available.


 

  • You can now copy widgets across Stories and Analytic Applications, and across multiple tabs.

 

Planning

  • You can now export Activity Type Plan Cost Rates (ACCOSTRATE) from SAP Analytics Cloud to S/4HANA.
  • A new function TODAY has been introduced in Advanced Formulas that returns the current UTC date in YYYYMMDD format.
  • In cross-model copy steps in Data Actions, a new option ‘Ignore’ has been introduced that allows you to exclude any source members that were not automatically mapped to a target member.
  • The data locking editor can now be opened from a data locking task in read-only mode.

 

Smart Assist

  • Search to Insights now supports histograms, which can be viewed using the keyword ‘as histogram’.


  • Smart Insights can now be used in the Explorer and the Digital Boardroom.
  • Search to Insights has been enhanced with support for numeric decimals and words like “a million”, “half a million”, “m” (for million) and “b” (for billion).


 

Other Updates

In addition to the above, there are some minor updates as well, which include a redesigned option for adding widgets in Analytics Designer, the option to display ID, Description or both in an input control, and improvements to number formatting in Geo Maps such as scaling and decimals.

If you are interested in learning more about SAP Analytics Cloud, check out our series of blogs here.


The post SAP Analytics Cloud – What’s New in version Q2 2020 appeared first on Visual BI Solutions.

Smart Data Prep in Tableau


Tableau, a highly self-service tool, makes use of emerging technologies to provide its users with cutting-edge capabilities for sophisticated data analysis. In addition to the existing analytical richness of the Tableau platform, it provides various smart analytics capabilities for better usage with respect to data and visualizations. Smart data prep is one of these smart analytics features, enhancing competencies in data preparation.

Have you ever speculated how emerging technologies like Artificial Intelligence and Machine Learning can impact your business decisions when incorporated for analytics purposes? To achieve this breakthrough, Tableau has fused machine-learning capabilities into its self-service data preparation features, making them more automatic and easing the effort involved in data analysis. This minimizes the manual work required to combine, profile, and clean the data, which facilitates arriving at decisions from the data more rapidly. There are three concepts when it comes to smart data prep:

  • Data interpreter 
  • Fuzzy matching  
  • Smart recommendations 

Data Interpreter

Data Interpreter is a straightforward approach that aids faster analysis: it automatically detects sub-tables and enables users to pivot and split data from the added data source. It identifies the structure of the data (titles, footers, values and so on) and parses them into a proper format for improved analysis. During data interpretation, Tableau never makes changes to the underlying data source; it only cleanses and modifies the data loaded.

Enabling the highlighted option ‘Data Interpreter’ on the data source page, displayed after loading the data, interprets and cleanses the data. Tableau also provides an option to review the results of the Interpreter’s cleansing. The results can be viewed as an Excel file in which a key explains how the data source has been interpreted. Data construed as columns, data values, merged-cell values, and excluded values are highlighted appropriately in the result sheets. It also cites, in a separate field, whether a value is a header or a data value.

The Data Interpreter works based on a decision tree built internally to determine which data is to be included and excluded. Tableau will also identify any sub-tables in the data, which can be used just like other data in your workbooks.

 

Fuzzy Matching

There may be instances during data prep where a few fields require clean-up before they can be used for better analysis. Fuzzy matching, a method with the ability to process word-based matching, indexes and groups values that are related by pronunciation or common characters in Tableau Prep. This is advantageous during data prep as it eases the manual work of data cleansing.

Once you add a step to the flow in Tableau Prep, click the icon at the right corner of the field name to display the ‘Group and Replace’ option.

 

Tableau groups the values based on the following options, which are listed on clicking ‘Group and Replace’:

  • Manual selection
  • Pronunciation
  • Common Characters
  • Spelling


 

The values after being grouped based on the requirements are denoted by a ‘clip’ icon.

 

Tableau employs efficient and powerful algorithms and machine learning capabilities to achieve the fuzzy matching feature.

Smart Recommendations

Tableau mines existing data connection patterns and recommends data sources and joins based on your organization’s usage, which is achieved using machine-learning capabilities employed in the background. Dimensions of data are also recommended based on a user’s past personal data consumption patterns. When no history is available for a specific user, it takes the usage metrics of other users into consideration.

 

The recommended data sources can be viewed on Tableau Server and added directly as a data source via the ‘Add Data Source’ option.

 

Advantages of Smart Data Prep

Using the features of Smart Data Prep reduces the manual work involved in data preparation, giving users more automated cleansing and formatting options. The ML capabilities employed to achieve these functionalities make it more efficient to perform your analytics effectively. This improves the user’s self-service experience, which in turn aids in arriving at quicker data-driven decisions.

To know more about the Smart Analytics features provided by Tableau check out our blog here. Reach out to us for enhanced data analytics capabilities to drive your organization based on data-driven decisions.

 


The post Smart Data Prep in Tableau appeared first on Visual BI Solutions.

Connecting Python with Tableau


In this Internet era, the amount of data being generated from various sources, with detailed granularity, is humongous. Data being the new gold, it has become inevitable to use it for advanced analytics to uncover useful business insights for the betterment of one’s organization. Python and Tableau are among the best technologies addressing analytics challenges. This blog briefly introduces both tools and walks through the steps to connect to Python from Tableau.

 

Python

Python is an object-oriented, open-source, high-level programming language with dynamic semantics. Though Python has many high-level data structures, it is very user-friendly and simple to learn. Python also supports packages and modules, which help increase program modularity and allow code to be integrated and reused with other technologies.

Tableau

Tableau is a self-service data visualization tool that helps customers view and present data in the form of interactive dashboards and charts to showcase insights and perform real-time data analytics. Tableau is very user-friendly, providing a drag-and-drop user interface to visualize the available data, with minimal scripting required only for calculated fields.

Prerequisites

Connecting Python to Tableau requires us to install a Python IDE (Anaconda Navigator), which is also an open-source tool.

How to integrate Python with Tableau?

TabPy is the API that enables the execution of Python code from within a Tableau workbook.

Steps to integrate Python with Tableau:

1. After downloading Anaconda Navigator, the next step is to download the TabPy server. The TabPy server can be installed by typing conda install -c anaconda tabpy-server in the Anaconda prompt.

After the packages are listed, it will ask for yes or no to proceed; press y to install the server.

2. After the TabPy server is installed, it should be started in order to connect with Tableau. To start the TabPy server, change the directory from the root directory to the folder where it is installed by typing cd C:\Users\*your username*\Anaconda3\pkgs\tabpy-server-0.2-py37_1\Lib\site-packages\tabpy_server in the Anaconda prompt.

3. Type the next command, startup.bat, to start the server.

4. After the startup.bat command, the prompt displays port number 9004, on which the TabPy server is initialized. After initializing the server, the next part is to connect it with Tableau. Open Tableau Desktop.

5. In Tableau go to:

  • Open the Help menu.
  • Choose Settings and Performance.
  • Then choose Manage External Service Connection.
  • Select TabPy/External API.
  • Select localhost.
  • Make sure the port number is 9004.

Click on test connection to cross-check the connectivity.

Once the connection is successful then click on the OK button to approve the external connection.

 

Why Python+Tableau?

When TabPy is used with Tableau, calculated fields can be defined in Python, which enables us to use the power of many machine-learning libraries right from Tableau visualizations. It enables many new features like machine-learning predictions, sentiment analysis, and time-series forecasting using various models by customizing calculated fields.
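For instance, a calculated field can pass a measure to TabPy with SCRIPT_REAL; inside the script, _arg1 holds the list of values for the first argument, and a list of the same length must be returned. A minimal sketch (the field [Sales] is a placeholder):

```
SCRIPT_REAL("return [v * 1.1 for v in _arg1]", SUM([Sales]))
```

This particular example simply scales SUM([Sales]) by 10% in Python; in practice, the script body would typically call a machine-learning library instead.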

 

Limitations of integrating Python with Tableau

Though there are many advantages of enabling Tabpy there are also certain limitations.

  • When a large dataset is used, the waiting time will be longer, as the script runs each time you make a change to the view.
  • Python-generated calculated fields will not be extracted if you create a Tableau extract.
  • A Python script will run only when you put it into the view.
  • You cannot use TabPy calculations to create values and base additional calculations on those values unless both calculated fields are in the view.

When deployed together, Python integrated with Tableau can help deliver a scalable, flexible and advanced analytics platform.

To learn more about Visual BI’s Tableau Consulting & End User Training Programs, contact us here.


The post Connecting Python with Tableau appeared first on Visual BI Solutions.

Utilizing Azure Log Analytics Workspace for Azure Storage Account Logs


Logging is a crucial administrative task, as it helps identify the details of an event that occurred. Logs usually store details such as the username, time, the action of the user and metadata of the event. This is helpful for auditing and for forensic examination in the event of a crime.

Azure provides various monitoring tools that help identify resource usage and bottlenecks. Azure Storage provides a logging feature that gives information on the events that occurred on the storage account.

 

Azure Log Analytics Workspace 

The Log Analytics workspace is a service in Azure that enables us to collect logs from multiple services, such as an Azure Storage account and Azure Virtual Machines. The collected logs can then be queried using a custom language called KQL (Kusto Query Language). KQL, also known as the ‘Log Analytics query language’, is like SQL with the additional capability to render charts.

You can add various types of events for loading into the Log Analytics workspace, and then combine it in the dashboard tiles.

Thus, Log Analytics Workspace provides a single place where you can store logs from different services, query them and build a dashboard from it.

Azure Storage Account

Azure Storage Account provides a storage platform on the cloud enabling us to store various kinds of data. Data can be stored as blobs, tables or queues. Lots of read/write/delete operations usually occur on the storage, and you might need to keep track of who is doing what.

To enable logging on an Azure Storage account, open the respective storage account. Go to Monitoring (classic) – > Diagnostic Settings (classic), select the version and check the operations you need to log (read/write/delete).

Azure Storage provides two versions of logging, and v2.0 is a superset of v1.0. The generated log contains details such as resource IDs, the request type, the operation performed, the operation status and network information like header size and authentication. v2.0 adds the unique IDs (UUIDs) of all the entities tied to the event.

The logs generated can be seen in the Azure Storage explorer under ‘$logs’ in the corresponding storage account. At the time of publishing this blog, ‘$logs’ is not visible in the preview version of Azure Storage Explorer in the Azure portal.

The log files are organized in a year/month/day folder structure and the file contents are ‘;’ separated. The log files can be downloaded and analyzed in your favorite tool or can be automatically imported into the Log Analytics workspace.
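Because the files are plain ‘;’-separated text, a few lines of Python are enough to pull out individual fields for ad-hoc analysis. A minimal sketch (the sample line below is hypothetical and abbreviated; it assumes the operation type sits in the third field, as in the documented log format):

```python
import csv
import io

# Hypothetical, abbreviated log line:
# version;request-start-time;operation-type;request-status;http-status-code
sample = "1.0;2020-05-01T10:00:00.0000000Z;GetBlob;Success;200"

# The csv module reads the ';'-separated format directly
for row in csv.reader(io.StringIO(sample), delimiter=";"):
    operation_type = row[2]
    print(operation_type)  # → GetBlob
```

In practice, you would open each downloaded log file instead of the in-memory string and aggregate the rows as needed.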

Loading log files into Log Analytics Workspace

At the time of publishing this blog, there is no direct way to connect ‘$logs’ to the analytics workspace. Microsoft has provided a PowerShell script that can be run to fetch logs and post them in the workspace.

https://github.com/Azure/azure-docs-powershell-samples/blob/master/storage/post-storage-logs-to-log-analytics/PostStorageLogs2LogAnalytics.ps1

Steps to load data:

  1. Download the PowerShell program from the link provided above. Using ‘Powershell ISE’ to run the program is recommended.
  2. Insert your respective ids at the top of the program. The details of getting the ids have been provided in the comments on top of each variable.
  3. ‘$LogType’ is the name of the table that will be created for the logs from this storage account. ‘_CL’ will be appended to the table name during creation.
  4. Once you have inserted the required fields, run the program and it should import all the logs into the workspace. You can automate this using Azure DevOps.

Querying Log Analytics Workspace

Once the logs are imported, open the Log Analytics workspace, select ‘Logs’ in the left pane and you should see your logs under the Custom Logs hierarchy. To query, you need to use the KQL (Kusto Query Language) which is like SQL.

Consider gen2_logs_CL as my custom log table, from which I need to select Operation_Type. In SQL, we would write it as below:

SELECT Operation_Type FROM gen2_logs_CL

In KQL:

gen2_logs_CL | project Operation_Type

In the image below, we have grouped by operation type and created a pie chart to see which operations are the most common in the storage account. The render command is specific to KQL and is used to produce a chart from the output of the query.
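A query of that shape might look like the following sketch, reusing the gen2_logs_CL table from above:

```
gen2_logs_CL
| summarize OperationCount = count() by Operation_Type
| render piechart
```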

Dashboards in the Log Analytics workspace allow the various queries we create across different services to be added in a single place. This gives us a quick look at the logs of all the services.

After creating this, you can add the pie chart to a new dashboard or an existing dashboard within the Log Analytics Workspace which will automatically update every time the custom log table is updated.

Thus, by using Azure Storage Analytics and Log analytics workspace, we can derive useful insights into the events that happen in the Azure Storage Account.

Reach out to us for implementation details or questions. Learn more about Visual BI’s Microsoft Azure offerings here.

 


The post Utilizing Azure Log Analytics Workspace for Azure Storage Account Logs appeared first on Visual BI Solutions.

Overview of Azure Synapse


Azure Synapse is the evolution of Azure SQL Data Warehouse and is expected to bridge the gap between data lakes and data warehouses. Azure Synapse focuses on integrating all the analytic capabilities into a single service, bringing enterprise data warehousing and big data analytics together. Azure Synapse is the industry’s first enterprise-class cloud data warehouse that can grow, shrink and pause in seconds. We can query huge amounts of data either serverless on-demand or with provisioned Azure resources; a serverless deployment automatically scales storage and compute power.

How does it work?

Synapse uses a node-based architecture that distributes the compute power across multiple nodes. The compute and storage nodes are independent of each other, and the number of compute nodes ranges from 1 to 60. The unit of compute power is called a Data Warehouse Unit (DWU).

It is evident from the architecture diagram that the control node is the single point to which all applications connect. It optimizes queries to run in parallel with its Massively Parallel Processing engine, passing the queries to the compute nodes to run in parallel. Synapse uses the Data Movement Service, an internal service that automatically moves data across the compute nodes.

One of the key features of Azure Synapse is the independent Compute and Storage facility that allows us to scale the compute power up or down without any data loss and also grants the pause option on the compute thus enabling us to pay only for the storage.

Some of the features of Azure Synapse that make it unique and welcoming are discussed here.

Distribution

Azure Synapse storage uses various distribution techniques to optimize the performance of the system. Each query is divided into smaller queries (from 1 to 60) that run in parallel across the compute nodes. The three distributions are:

Hash –  Highest query performance for joins and aggregations on large tables

Round Robin – Simple and quick to create

Replicate – Fast query performance for small tables
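In T-SQL, the distribution is chosen in the table’s WITH clause. A sketch (the table and column names are illustrative):

```sql
-- Large fact table: hash-distribute on a common join key
CREATE TABLE dbo.FactSales
(
    SaleId     INT NOT NULL,
    CustomerId INT NOT NULL,
    Amount     DECIMAL(18, 2)
)
WITH
(
    DISTRIBUTION = HASH(CustomerId),
    CLUSTERED COLUMNSTORE INDEX
);

-- Small dimension table: replicate a full copy to every compute node
CREATE TABLE dbo.DimRegion
(
    RegionId   INT NOT NULL,
    RegionName NVARCHAR(50)
)
WITH
(
    DISTRIBUTION = REPLICATE
);
```

Omitting the DISTRIBUTION option gives a round-robin table, the simple and quick default.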

 

Workload management

Azure SQL Data Warehouse did not have an effective way of managing workloads after creation. Workload management in Azure Synapse comes with the following concepts to give us more control over how the workload utilizes system resources:

  • Workload Classification
  • Workload Isolation
  • Workload Importance

The portal lets us monitor query activity and resource utilization in workload groups.
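Classification and isolation are set up with workload groups and classifiers in T-SQL. A sketch (group, classifier and login names are illustrative, and the exact option list may vary by version):

```sql
-- Isolation: reserve a slice of system resources for data loads
CREATE WORKLOAD GROUP wgDataLoads
WITH
(
    MIN_PERCENTAGE_RESOURCE = 25,
    CAP_PERCENTAGE_RESOURCE = 50,
    REQUEST_MIN_RESOURCE_GRANT_PERCENT = 25
);

-- Classification: route an ETL login into that group with high importance
CREATE WORKLOAD CLASSIFIER wcLoader
WITH
(
    WORKLOAD_GROUP = 'wgDataLoads',
    MEMBERNAME     = 'etl_user',
    IMPORTANCE     = HIGH
);
```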

Security

Azure Synapse comes with the following techniques to handle security.

  • Firewall rules – Server-level IP firewall rules
  • Connection Encryption
  • Authentication – SQL Server and Azure Active Directory
  • Authorization privileges – Using roles and permissions
  • Data Protection – Dynamic Data Masking and Transparent Data Encryption

In addition to the above techniques, Azure Synapse has an advanced security option for highly sensitive data: Advanced Data Security helps us discover and classify sensitive data, while Advanced Threat Protection monitors the database for threats.

 

Analytics with Azure Synapse

Azure Synapse differs from all other cloud data warehouses in its unified approach to warehouse and analytics services. We can ingest, prepare, manage and serve data for immediate BI, including machine-learning needs, with Azure Synapse. Azure Synapse has four components:

  1. Synapse SQL
  2. Spark
  3. Synapse Pipelines
  4. Studio


We have covered Synapse SQL, which is generally available as Azure SQL Data Warehouse. Azure Synapse offers 85+ connectors to load data. In addition, Azure Synapse can run a Spark environment in notebooks, similar to Databricks, supporting multiple languages like PySpark (Python), Spark (Scala), .NET Spark (C#) and Spark SQL.

Data can be orchestrated in Azure Synapse using Azure pipelines, and transformations such as aggregations and joins can be done with Data Flows, as in Azure Data Factory. In this way, Azure Synapse provides a whole set of ELT tools along with additional analytics features, including machine learning and visualization with Power BI.

With Azure Synapse Studio, Azure Synapse brings a single workspace for data professionals to work with their data.

Reach out to us for implementation details or questions. Learn more about Visual BI’s Microsoft Azure offerings here.


The post Overview of Azure Synapse appeared first on Visual BI Solutions.

Analysis Authorization in BW/4HANA Cockpit


SAP BW/4HANA cockpit is a web-based interface used for the administration of a BW/4HANA system. In the BW/4HANA cockpit, under the Modeling tab, you will find a tile named ‘Analysis Authorizations Editor’. By clicking on it, we can restrict the transaction data that is accessed by a large group of users.

Analysis Authorization helps us create authorization variables for characteristics, restricting access to the values of those characteristics. Analysis authorization can also be applied to a navigation attribute in a query. Below are the steps to create the authorization object in the BW/4HANA cockpit or using the app.

Step 1: Click on the Add (create) button on the left side of the page.

You will then see three tabs:

  • General
  • Special Characteristics
  • Characteristics

 

General Tab

Step 2:  In the General tab, we mention the technical name and description of the authorization object that we want to create.

 

Special Characteristics Tab

Step 3: In the Special Characteristics tab, we can either choose the “Use Default” property or select properties based on our requirements.

When “Use Default” is selected, it automatically selects full validity, full InfoProvider and display-only activity.

If we want this authorization object to be valid only for a specific period, we can de-select Full Validity and click on Expert Mode to enter the validity period in the Validity tab.

Step 4: If we want the authorization to apply only to a few key figures or a few ADSOs, we can de-select Full and give the technical names in the InfoProvider tab.

Step 5: We can also set either display or change activity for the authorization object, which decides whether a user with this authorization can change the data or only view it.

 

Characteristics Tab

Step 6: In the Characteristics tab, we can mention a specific InfoObject and give the values on which we want to restrict.

Step 7: After entering the InfoObject, we can restrict further by choosing to display only the aggregated values or the entire data, by mentioning single or multiple values, and, if a hierarchy exists, by restricting the node up to which the data can be drilled down.

Step 8: After entering all the details, click Save and activate the object. The authorization object can also be transported, so we need to capture a transport log to move the object to other systems.

After creating the authorization object, we need to assign it to the user in SU01 and enable the authorization object in the query to restrict access to the data. We can choose to deactivate an existing authorization object if we no longer need the authorization for a query or InfoProvider, and remove the authorization filter from the query.

Similarly, many other apps are grouped together in the BW/4HANA cockpit to perform specific operations in the BW/4HANA system.

 

Read more from BW/4HANA Cockpit category here. Know more about Visual BI Solutions SAP/BW Services offerings here.


The post Analysis Authorization in BW/4HANA Cockpit appeared first on Visual BI Solutions.

Configuring SAP BW/4HANA Cockpit on Mobile phones or Tablets


SAP BW/4HANA cockpit is a web-based interface used for the administration of a BW/4HANA system. The main aim of launching an SAP Fiori app is to use SAP on the go with minimal user interaction. The app is built mainly for frequently used functions. It helps us perform standard SAP transactions on our mobile devices, view information about business objects, and measure performance.

Furnished below are the steps to install and configure the app on your phone or tablet.

Step 1: Install the SAP Fiori app from the Play Store, or from the App Store if you are an iOS user.
configuring-sap-bw-4hana-cockpit-mobile-phones-tablets

 

Step 2: On opening the installed app, you will see the screen below. Click the Login button.
configuring-sap-bw-4hana-cockpit-mobile-phones-tablets

 

Step 3: On clicking the Login button, the app takes you to a screen where you can enter the URL, scan the barcode, or enter your work e-mail ID to connect your SAP BW/4HANA system to the app.
configuring-sap-bw-4hana-cockpit-mobile-phones-tablets

 

Step 4: After the system is connected with the app, it asks for a security passcode. You can either set a character-based passcode or, if your phone or tablet supports it, enable your fingerprint as the passcode.
configuring-sap-bw-4hana-cockpit-mobile-phones-tablets

 

Step 5: After the passcode is set, the app takes you directly to the login screen, where you need to enter your logon credentials.
configuring-sap-bw-4hana-cockpit-mobile-phones-tablets

 

Step 6: After the credentials are verified, it takes you directly to the home screen.
configuring-sap-bw-4hana-cockpit-mobile-phones-tablets

The key differences between the web portal and the mobile app are that the app cannot view the metadata repository or perform transport-related operations. Data tiering maintenance and viewing or maintaining the linear store data connection properties are also disabled in the mobile app.

Read more from BW/4HANA Cockpit category here. Know more about Visual BI Solutions SAP/BW Services offerings here.


The post Configuring SAP BW/4HANA Cockpit on Mobile phones or Tablets appeared first on Visual BI Solutions.


Workspace Tools, Designer and Query Designer


SAP BW/4HANA cockpit is a Web-based interface used for administration of a BW/4HANA system. In BW/4HANA cockpit, under the Modeling tab, you will find the following tiles to work on anything related to Workspaces.

 

Workspace

In BW/4HANA Version 2.0, a BW Workspace is a dedicated area in a BW system where new models can be created based on central BW data and local data (e.g. flat files). The workspace can be created in the BW environment using transaction code RSWSP. Enter a name and click Create.
workspace-tools-designer-query-designer

 

Workspace Tools

The workspace tools are used to maintain the workspaces available in the BW environment. The needed workspace should be selected in Workspace Tools before being used in Workspace Designer.

It has 3 options:

  1. Change Workspace
  2. Unlock Workspace
  3. Clear Workspace

 

Change Workspace

As the name suggests, it is used to switch between existing workspaces. The needed workspace can be enabled here before working in Workspace Designer.

 

Unlock Workspace

It unlocks a workspace that has been locked by another user, ensuring work is not blocked by the lock.

 

Clear Workspace

It is used to clear the composite providers, local providers, local data and local queries that we have created in a workspace. We can search for an object using the search option and then select the Delete option to clear it from the workspace.

 

Workspace Designer

It is the environment where we create local providers and manipulate data to check whether changes affect the existing model. It is used when new scenarios are needed on existing providers, and it enables users as well as developers to check the data in development with a local provider. If everything looks fine, the same can be implemented in the actual provider and moved across to production. It enables the user to test any needed scenario.

It has the following options available:

  1. Local CompositeProviders
  2. Central Providers
  3. Local Providers
  4. Local Data
  5. Local Queries
  6. Settings

workspace-tools-designer-query-designer

 

Local Composite Providers

It is created to do modifications based on central BW objects. It has the following options,
workspace-tools-designer-query-designer

 

The Create option is used to create a local composite provider. On clicking the option, the user can create a composite provider based on a query or a general provider. Click Continue; a dialog lets you select from the available providers, and you should provide a name and description for the same. Click Next.
workspace-tools-designer-query-designer

 

Now we need to model the composite provider and provide the join types; in the link details, specify the join field.
workspace-tools-designer-query-designer

 

Click Next. You can edit the fields based on the requirement. Click Next. You can create a query for this provider, or you can also create an SAP HANA view for it.

Click Next. Save and activate the local composite provider; it is now created.

Upon selecting any local composite provider, the other options are enabled. You can change the details of a provider, delete it if no longer required, use the Display Data option to check its data, inform users by e-mailing them the local composite provider, and check its where-used list.

Central Providers

The provider that is specified as the main provider in the workspace is the central provider. The options here display the data of the provider and its where-used list, or create a template file of the data in CSV format for the local system.
workspace-tools-designer-query-designer

 

Local Providers

This section is used to create, reload and otherwise maintain locally created providers. Its available options are:
workspace-tools-designer-query-designer

 

Create

It is used to create a provider based on a file, a DataSource, a query or agile data preparation. Select the required option and create the provider.

The provider properties, such as enabling planning, auditing, or appending and deleting data, are the same for all types.
workspace-tools-designer-query-designer

 

For a file-based provider, upload the file from the local system and specify the header and delimiter.
workspace-tools-designer-query-designer

 

For a query-based provider, select the needed query and proceed. For a DataSource-based provider, DataSources that were specified while creating the workspace can be used here.

For agile data provisioning, if we have a remote source provisioned, we can create a provider on it and assign a name and description for the same.

 

Local Data

It is used to create a local characteristic or hierarchy to be used in a local provider.
workspace-tools-designer-query-designer

We can select the needed option, and the characteristic will be created based on that master InfoObject. We can add attributes if needed.

A local hierarchy is created based on an existing InfoObject. We need to mention the internal hierarchy ID.

 

Local Queries

It is used to create queries on local providers or local data. It helps in displaying the output for the changes made.
workspace-tools-designer-query-designer

The Change and Delete options either change the query or delete it, depending on need. To create a query, click Start Workspace Query Designer.

 

Workspace Query Designer
workspace-tools-designer-query-designer

Here, we can create a new query or open an existing one. It has options to do the same based on an InfoProvider, and also to change the workspace if needed.

Upon choosing Create a Query, the providers available in the workspace are listed and we can select one as needed. We are then presented with the query designer panel.
workspace-tools-designer-query-designer

We can design the query based on our needs and click the Save button. Saved with an appropriate name, the query becomes available in Workspace Designer.
workspace-tools-designer-query-designer

 

Settings

It shows the global settings that were given during the creation of the workspace in the BW environment.
workspace-tools-designer-query-designer

 

Thus, using workspaces enables users as well as developers to practically implement any required changes in development and check the outcome before applying it in production. Similarly, we have many other apps grouped together in BW/4HANA Cockpit to perform specific operations in the BW/4HANA system.

Read more from BW/4HANA Cockpit category here. Know more about Visual BI Solutions SAP/BW Services offerings here.

 


The post Workspace Tools, Designer and Query Designer appeared first on Visual BI Solutions.

Remodeling Requests in BW/4HANA Cockpit


SAP BW/4HANA cockpit is a Web-based interface used for administration of a BW/4HANA system. In BW/4HANA cockpit, under the Monitoring tab, you will find a tile named ‘Remodeling Requests’. Remodeling is done when a structural change is made to an ADSO. It is generally done in RSMONITOR, but this version enables us to do the same without using the GUI. It also enables a new modeling approach (field-based / InfoObject-based modeling).

Some of the benefits of remodeling in BW/4HANA 2.0 are:

  1. It helps in enhancing or changing data models without reloading.
  2. It requires little remodeling effort and is easy to use.
  3. It enables high productivity.

Available Remodeling options
remodeling-requests-bw-4hana-cockpit

 

1. Replace by info object

If this option is selected, data is loaded to the selected InfoObject/field the way data is loaded in the named InfoObject. In short, it defines how to load data and replaces the field with the InfoObject.

2. Replace by field

It replaces the selected InfoObject or field with the specified name and its data. The field being given should not already be used in the ADSO.

 

3. Fill with a value of info object

This loads the data from the specified InfoObject; that InfoObject should be present in the ADSO.

 

4. Fill with the value of the field

This loads the data from the specified field; that field should be present in the ADSO.

 

5. Fill by a constant value

It is like filling a constant value in a transformation for an InfoObject; the value to be inserted is mentioned here. Once the changes are made, the ADSO is activated. This creates a remodeling request for the ADSO, which can be processed in the BW/4HANA Web Cockpit.
remodeling-requests-bw-4hana-cockpit
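The fill-style options above can be pictured with a small sketch. This is only a conceptual Python illustration of what the options do to the data, with made-up records and field names — the actual remodeling runs inside BW/4HANA:

```python
# Conceptual sketch of two remodeling fills on ADSO-like records
# (illustration only; actual remodeling runs inside BW/4HANA):

def fill_with_constant(records, target, value):
    """'Fill by a constant value': set the new field to a fixed value."""
    for r in records:
        r[target] = value
    return records

def fill_from_field(records, target, source):
    """'Fill with the value of the field': copy an existing field's data."""
    for r in records:
        r[target] = r[source]
    return records

data = [{"material": "M1", "plant": "P100"}, {"material": "M2", "plant": "P200"}]
fill_with_constant(data, "company_code", "1000")
fill_from_field(data, "source_plant", "plant")
print(data[0])
# {'material': 'M1', 'plant': 'P100', 'company_code': '1000', 'source_plant': 'P100'}
```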

It is available in the Monitoring tab of the cockpit.
remodeling-requests-bw-4hana-cockpit

 

We can find the request in the following list:
remodeling-requests-bw-4hana-cockpit

Selecting the request takes us to the remodeling steps.
remodeling-requests-bw-4hana-cockpit

The options available are:

RUN: Starts the request.

RESET: Resets the whole request.

RESTART: Restarts the request if an issue occurred and has been solved.

RESET STEP: Resets a specific step of the run.

REFRESH: Refreshes the request and checks its status.

On pressing the RUN button, it asks the developer whether the request should run immediately or be scheduled.
remodeling-requests-bw-4hana-cockpit

After the completion of the remodeling, it looks as follows:
remodeling-requests-bw-4hana-cockpit

As discussed, deleting and reloading the data are not required; the data is available in the ADSO.
remodeling-requests-bw-4hana-cockpit

Similarly, we have many other apps grouped together in BW/4HANA Cockpit to perform specific operations in BW/4HANA system.

Read more from BW/4HANA Cockpit category here. Know more about Visual BI Solutions SAP/BW Services offerings here.


The post Remodeling Requests in BW/4HANA Cockpit appeared first on Visual BI Solutions.

Top 8 new features in Power BI – you shouldn’t miss out


With the recent updates to Power BI, storytelling with data has become even more easy and exciting. Let us look at these much-awaited features.

 

1. Personalised visuals

Personalised visuals aid the end-users of the report by allowing them to customise and explore a visual based on their specific use-case: they can change the type of visual and swap fields, measures and aggregations. This feature can be toggled on/off for each visual, and for the report page in general.
top-8-new-features-power-bi-shouldnt-miss

 

 

2. Deployment Pipelines

With the ever-widening need for analytics and data, it becomes essential to develop an efficient and reusable deployment pipeline for faster delivery of content and to reduce manual work and errors.

To address this gap, Power BI has introduced Deployment Pipelines as a preview feature in its latest release. This enables enterprise BI teams to configure development, test and production environments and incrementally transition content between environments. Currently, this feature is available only for Power BI Premium capacity users.

 

3. New action types for buttons

 

a. Page navigation:

For any Power BI report, page navigation is an integral part of the design. Until now, bookmarks had to be created, managed and integrated into buttons as a work-around to mimic a page navigation experience. With the latest update, the number of steps is reduced significantly with the new page navigation action type for buttons. With a page navigation button, the destination page can be selected directly, without the use of bookmarks. Conditional page navigation can also be set up to optimize space.
top-8-new-features-power-bi-shouldnt-miss

 

b. Drill through:

Drill through scenarios do not have the required visibility for the end-users in some reports. In order to enhance their discoverability, a new Drill through action type has been introduced for buttons. To make use of this feature, a drill through page must exist, which can be selected in the destination tab.

 

4. User Experience – Drop shadows and fill images

Power BI has introduced two new features that greatly enhance the user experience. Fill images support has been introduced for buttons. This combined with built-in button states can help developers create a better interactive experience in the report.

In addition to this, Power BI now has support for shadows in all the visuals. This gives a much-needed modern UI feel to Power BI reports. The drop shadow option is highly customisable with settings for size, blur, distance and transparency.
top-8-new-features-power-bi-shouldnt-miss

 

5. Customize themes

Until recently, custom themes in Power BI could only be created by writing or modifying a JSON file. With the new feature from Power BI, customizing your current theme has never been so simple. Most of the options in your theme can now be customized by selecting the Customize current theme option under the theming dropdown. Color scheme, font family and sizes, visuals, page settings and filter pane options can be formatted through this dialog.
top-8-new-features-power-bi-shouldnt-miss

 

In addition to custom themes, Power BI has also released a feature to export the current theme of the report. This feature comes handy to use your favorite theme across multiple reports with ease. Export current theme option is available under the theming dropdown.

 

6. Multi-column sort

Several business use-cases demand sorting by multiple columns for ease of understanding and perception. With its latest release, Power BI has resolved this by including a multi-column sort option for tables. Shift + click enables you to include multiple columns under the sort and to change the direction of sorting.
top-8-new-features-power-bi-shouldnt-miss
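The underlying idea is an ordinary multi-key sort. As a Python analogy with made-up sample data, sorting by region ascending and then sales descending comes down to a tuple sort key:

```python
# Multi-column sort analogy: region ascending, then sales descending,
# expressed as a tuple key (sample records are made up for illustration).
rows = [
    {"region": "East", "sales": 120},
    {"region": "West", "sales": 300},
    {"region": "East", "sales": 450},
]
ordered = sorted(rows, key=lambda r: (r["region"], -r["sales"]))
print([(r["region"], r["sales"]) for r in ordered])
# [('East', 450), ('East', 120), ('West', 300)]
```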

 

7. Conditional formatting for totals and sub-totals

Conditional formatting of values in table and matrix visuals is crucial for communicating the right insights to the end-users. This feature was unavailable for totals and sub-totals, making them stand out as an anomaly in a report. To address this, Power BI has at last introduced this much-awaited feature, enabling conditional formatting for totals.

 

8. Change detection for auto-refresh

Page refresh for DirectQuery sources can be scheduled based on change detection instead of a pre-defined time interval. The change detection can be based on a new measure or an existing measure in the model. A page refresh occurs only when the change detection measure returns a new value.

For example, in a transactional database, if the count of the transaction ID is defined as a measure, the report refreshes only when a new transaction is added to the table.
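The polling logic behind change detection can be sketched as follows — a conceptual Python illustration only, not Power BI internals:

```python
# Conceptual sketch of change-detection-driven page refresh (illustration
# only, not Power BI internals): the page refreshes only when the
# change-detection measure returns a new value.

def should_refresh(previous_value, current_value):
    """Refresh only if the measure changed since the last poll."""
    return current_value != previous_value

# Example: count of transaction IDs as the change-detection measure.
last_count = 120
print(should_refresh(last_count, 120))  # False - no new transactions, skip refresh
print(should_refresh(last_count, 121))  # True - a new transaction arrived, refresh
```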

Learn more about Microsoft Power BI services offerings from Visual BI solutions here.


The post Top 8 new features in Power BI – you shouldn’t miss out appeared first on Visual BI Solutions.

Web Scraping and Sentiment Analysis in Power BI


In any e-commerce website, the overall rating allocated to a product is based on the individual reviews by customers. Lately, these ratings and reviews have become a key factor in determining whether a customer will buy a product or not. Using built-in features of Power BI like AI Insights, sentiment analysis and web scraping, reviews along with the product specification can be managed, studied and analysed, helping a great deal with the decision-making process.

In this blog, we are going to dynamically retrieve reviews of mobile phones from Amazon and run sentiment analysis to understand whether the product is preferred by customers.

web-scrapping-sentiment-analysis-power-bi

Note:
Before proceeding with web scraping and any AI analysis, ensure the ‘New web table interface’ and ‘AI insights function browser’ preview features are enabled under Options.

 

Web scraping product details

By selecting the data source as Web, Power BI simplifies retrieving content from any webpage, provided we have its URL. Since we are interested in retrieving mobile phone reviews and details, make a simple search for the required mobile phone model on Amazon, copy its URL and enter it in the dialogue box that pops up when you select Web as your data source.
web-scrapping-sentiment-analysis-power-bi

When you click OK, a connection to the webpage is established and the entire content corresponding to the URL is pulled into Power BI as an HTML source. As a result, the tables in the HTML become available as data sources.

For example, Table 5 displays ratings of the item:
web-scrapping-sentiment-analysis-power-bi  web-scrapping-sentiment-analysis-power-bi

 

However, we are interested in collecting customer reviews. These reviews are not available as HTML tables but are stored under a certain HTML class. To retrieve this data, Power BI provides the ability to Add Tables using Examples.
web-scrapping-sentiment-analysis-power-bi
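To see what "stored under a certain HTML class" means, here is a standard-library Python sketch. The class name and sample markup below are made up for illustration; Power BI's Add Tables using Examples performs this kind of mapping for you:

```python
# Sketch: pull the text of every element carrying a given class from raw
# HTML, using only the standard library (class name is hypothetical).
from html.parser import HTMLParser

class ClassTextExtractor(HTMLParser):
    def __init__(self, target_class):
        super().__init__()
        self.target_class = target_class
        self.depth = 0          # >0 while inside a matching element
        self.texts = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.depth or self.target_class in classes:
            self.depth += 1
            if self.depth == 1:
                self.texts.append("")   # start a new review

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            self.texts[-1] += data      # accumulate text inside the element

html = """
<div class="review-text">Great battery life</div>
<div class="review-text">Camera could be better</div>
<div class="price">$499</div>
"""
parser = ClassTextExtractor("review-text")
parser.feed(html)
print(parser.texts)  # ['Great battery life', 'Camera could be better']
```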

 

When we select this option, we can add new columns and generate values for them based on sample data. Using this sample data, Power BI automatically retrieves the values that map to the subsequent data hooks under the same class. By adding a new column, Reviews, and including some sample data, all the other subsequent data gets automatically filled in by Power BI.
web-scrapping-sentiment-analysis-power-bi

 

Likewise, we add two more columns, Ratings and Review Text, the latter containing the comments given by the user in each review. The same process can be repeated to extract more content from the website.
web-scrapping-sentiment-analysis-power-bi

Sentiment Analysis

Once we have the source tables ready, the next step is to apply sentiment analysis to the content web-scraped from Amazon. Text Analytics, part of the AI Insights feature of Power BI, allows users to build the sentiment analysis model. It utilizes Azure Cognitive Services to obtain the sentiment score.

Sentiment analysis takes a text input and runs a machine learning algorithm which assigns a score ranging from 0 to 1: a score of 0 indicates negative sentiment and 1 indicates positive sentiment. The model in Cognitive Services is trained on a dictionary of texts mapped to their corresponding sentiments. This trained model is used to assign the sentiment score for the extracted reviews.

The scores are generated in a new column which can be classified into positive (> 0.5), negative (<0.5) and neutral (=0.5).
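That bucketing rule can be written down directly. A minimal Python sketch, with made-up review texts and scores:

```python
# Classifying sentiment scores (0..1) into the three buckets used in the
# report; the review texts and scores below are made up for illustration.

def classify(score):
    if score > 0.5:
        return "positive"
    if score < 0.5:
        return "negative"
    return "neutral"

reviews = {"Great phone": 0.93, "Stopped working": 0.08, "It is a phone": 0.5}
print({text: classify(s) for text, s in reviews.items()})
# {'Great phone': 'positive', 'Stopped working': 'negative', 'It is a phone': 'neutral'}

# Overall verdict mirrors the gauge: an average score above 0.5 means the
# product is, on balance, positively reviewed.
avg = sum(reviews.values()) / len(reviews)
print("preferred" if avg > 0.5 else "not preferred")  # preferred
```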

 

Enabling Parameters to the URL

As the final step, create a new parameter to dynamically modify the source URL. By using parameters, similar information for different products can be extracted by manipulating the URL source.
web-scrapping-sentiment-analysis-power-bi

web-scrapping-sentiment-analysis-power-bi
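The parameter simply substitutes the product-specific part of the source URL. As a hedged sketch in Python — the URL shape below is illustrative, not necessarily Amazon's exact format:

```python
# Sketch of the URL-parameter idea: swap the product-specific part of the
# source URL so the same queries run for any product (URL shape is
# illustrative only).
from urllib.parse import quote

BASE = "https://www.amazon.com/product-reviews/{asin}"

def review_url(asin):
    """Build the source URL for a given product identifier."""
    return BASE.format(asin=quote(asin))

print(review_url("B07XJ8C8F5"))
# https://www.amazon.com/product-reviews/B07XJ8C8F5
```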

 

Sentiment Analysis Report

After enabling the parameters and substituting them into the source URL, a Power BI report can be built to analyse the sentiment scores obtained over the product reviews. The primary result of the sentiment analysis, whether a product is preferred by customers, is indicated in a gauge whose range runs from 0 to 1, the sentiment score. If the average sentiment score is greater than 0.5, the product has positive reviews among its customers. The review classification shows the weightage of positive and negative reviews, and the distribution of sentiment scores provides a detailed summary of the information previewed in the gauge and the donut chart. Ratings out of 5, as rated by customers, are displayed in a pie chart, and the feature ratings obtained through web scraping are displayed in a column chart. Finally, a word cloud over the review texts highlights the key phrases: the frequency of a word in the review texts determines its size, so the most impactful words can be identified at a single glance.
web-scrapping-sentiment-analysis-power-bi
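The word-cloud sizing rule — frequency drives size — can be sketched in a few lines of Python; the sample reviews and stop-word list below are made up:

```python
# Sketch of the word-cloud sizing rule: word size tracks how often the
# word occurs across review texts (stop-word handling kept minimal).
from collections import Counter
import re

reviews = [
    "Great battery and great camera",
    "Battery drains fast",
    "Camera is great",
]
stopwords = {"and", "is"}
words = [w for text in reviews for w in re.findall(r"[a-z]+", text.lower())
         if w not in stopwords]
print(Counter(words).most_common(3))
```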

 

This simple report with basic sentiment analysis can be run across multiple products by manipulating the URL, to identify whether a product is preferred by customers or not.

Learn more about Microsoft Power BI services offerings from Visual BI solutions here.


The post Web Scraping and Sentiment Analysis in Power BI appeared first on Visual BI Solutions.

Demystifying the XMLA Endpoint-write operations in Power BI Premium


Organizations need semantic models that serve as the single source of truth for the enterprise to help in intelligent decision-making. Power BI, through XMLA endpoints, provides open-platform connectivity for Power BI datasets in premium capacities at the 1500 and higher compatibility levels. The read-only XMLA endpoint has been generally available since 2019. Using it, rich semantic Power BI models can be used in other tools, and customer solution implementations became highly scalable. Now, Microsoft has enabled write operations, in addition to read, on the XMLA endpoint as a public preview feature, and this brings more potential to Power BI in terms of capabilities.

This allows customers to leverage semantic models compatible with a wide range of tools like Visual Studio with the Analysis Services projects extension, SQL Server Management Studio, PowerShell cmdlets, DAX Studio, Microsoft Excel, and other third-party tools. Through read/write XMLA endpoints, these tools provide a rich set of advanced semantic modelling capabilities, dataset management, debugging, and monitoring. In this blog, we will mainly discuss the write operations supported through XMLA endpoints.

   

Supported Write Operations 

The read/write XMLA endpoint enables Visual Studio, Tabular Editor and other editors to provide additional semantic modelling capabilities supported by the Analysis Services engine but not yet supported in Power BI Desktop. These powerful functionalities introduced to Power BI Premium datasets include:

  • Calculation groups, which reduce the number of redundant measures by grouping common measure expressions as calculation items. This way, calculations become reusable.
  • Metadata translations to support multi-lingual reports. Tabular model objects can carry multiple translations of their names and descriptions.
  • Perspectives to define more refined views of the model specific to business domains. Users who need only a limited part of the model can be given a perspective instead of deterring them with the full, complex model. This enhances the user experience too.
  • Deploying models from Visual Studio with the Analysis Services projects extension, with a rich set of semantic modelling capabilities.
  • Fine-grain refresh capabilities using SQL Server Management Studio.

We can see in detail how to utilize XMLA endpoints to implement the above-listed features. 

 

1. Read/write XMLA endpoint

Prerequisites 

  • Enable XMLA read/write: By default, it’s read-only for premium capacity. In the admin portal, change it to read/write in capacity settings.
    demystifying-xmla-endpoint-write-operations-power-bi-premium
  • Enhanced metadata: XMLA write operations on datasets authored in Power BI Desktop and published to a Premium workspace require enhanced metadata. Explore enhanced metadata in detail here. In Power BI Desktop preview features, enable the “store datasets using enhanced metadata format” option.
  • Copy the Power BI premium workspace connection URL from ‘Settings’ ->’Premium’ -> ‘Workspace Connection’.

 

2. Calculation groups

Calculation groups work with explicit DAX measures. The ‘DiscourageImplicitMeasures’ model property should be set to ‘true’ to create calculation groups. Here we use a calculation group for reusable time-intelligence calculations. In Visual Studio, create the ‘Timecalgroup’ calculation group and its individual calculation items, and edit their DAX formulas accordingly.
demystifying-xmla-endpoint-write-operations-power-bi-premium

Calculation groups can be used after deploying to the workspace. We can reuse them with any DAX measure to get its time-intelligence calculations. We can see the calculation group in action below for the ‘sales’ DAX measure.

demystifying-xmla-endpoint-write-operations-power-bi-premium

Time Calculation Group

 

3. Metadata translations

Multiple translated strings can be given to objects like tables, columns, measures, etc. for their names and descriptions. In Visual Studio, we can create, manage, and import multiple translations based on the requirement. We can see the metadata translations in action below for the Spanish, French and Portuguese languages.

demystifying-xmla-endpoint-write-operations-power-bi-premium

Meta Data Translations

 

4. Perspectives

We can create perspectives from groups of fields in our model for each department in the organization to keep the work focused. In Visual Studio, select Create and Manage Perspectives.

demystifying-xmla-endpoint-write-operations-power-bi-premium

Manage Perspective

Select the required fields and deploy. We can see below how to access the created perspectives in Power BI.

demystifying-xmla-endpoint-write-operations-power-bi-premium

Accessing perspectives in Power BI

 

5. Fine-grain refresh capabilities in SQL server management studio

We can access Power BI Premium datasets from SQL Server Management Studio by connecting to the workspace URL. With this, we can run an incremental refresh for specific partitions alone, selecting them based on need, whereas in the Power BI workspace only a refresh of the entire set of partitions is possible. Refresh operations through the XMLA endpoint are not limited to 48 refreshes per day, and the scheduled refresh timeout is not imposed. The incremental refresh by partitions is shown in action below.
demystifying-xmla-endpoint-write-operations-power-bi-premium
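A partition-scoped refresh of this kind is expressed as a TMSL "refresh" command executed over the XMLA endpoint. The sketch below builds such a command in Python; the database, table and partition names are placeholders:

```python
# Sketch of a fine-grain refresh sent through the XMLA endpoint: a TMSL
# "refresh" command scoped to a single partition (all names are
# placeholders, not from the blog's dataset).
import json

refresh_partition = {
    "refresh": {
        "type": "full",
        "objects": [
            {
                "database": "SalesDataset",
                "table": "FactSales",
                "partition": "FactSales-2020-Q1",
            }
        ],
    }
}

# In SSMS, this JSON is executed as an XMLA/TMSL script against the
# premium workspace connection URL.
print(json.dumps(refresh_partition, indent=2))
```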

Thus, with the XMLA read/write-enabled endpoint, Power BI Premium datasets have more parity with Azure Analysis Services enterprise-grade tabular modeling tools and processes. With this, the Power BI platform is nearly converging enterprise and self-service BI into a single powerful platform.

Learn more about Microsoft Power BI services offerings from Visual BI solutions here.


The post Demystifying the XMLA Endpoint-write operations in Power BI Premium appeared first on Visual BI Solutions.

Efficient Project Management in Power BI with the Latest Gantt Chart Edition v1.1.6


 

This blog is part of the Gantt chart Blog Series.

We are proud to announce the latest release of our xViz Gantt chart (v1.1.6). Following is a quick summary of the exciting new features added as part of this release. You can also view the recent webinar, How to Manage and Track Projects in Power BI using xViz Gantt Chart, to get a quick understanding of how to configure and use these new features.

 

Latest Features – Summary

  1. Support for Ragged hierarchies (Filter blank)
  2. Fiscal Year
  3. Display Values for Parent nodes
  4. Conditional Formatting for a Parent node and Progress Bars
  5. Multiple Reference Lines
  6. Reference Range
  7. Web URLs
  8. Multiple connector lines configuration
  9. Data label enhancements
  10. Write back scenario
  11. Display Totals
  12. Date Timeline formats
  13. Zoom Levels
  14. Row numbering
  15. Locale support

Latest Features – Description

 

1. Support for Ragged hierarchies – Filter blank
One of the typical scenarios you come across in Gantt charts is ragged (uneven) hierarchies, where certain top nodes go much deeper than the other nodes. The xViz Gantt chart now provides the ability to hide the blank nodes which appear in case of ragged hierarchies.
efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6

2. Fiscal Year
You can now configure a different fiscal year start based on your enterprise needs. As seen in the example below, the two charts have different fiscal year starts: one starts in January and the other begins in April.
efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6
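The mapping from a date to its fiscal year, given a configurable start month, can be sketched as follows. The labelling convention used here (a fiscal year labelled by its start year) is an assumption for illustration:

```python
# Sketch of the fiscal-year mapping: with an April start, dates from
# April 2020 through March 2021 fall in fiscal year 2020 (labelling by
# start year is an assumed convention, for illustration only).
from datetime import date

def fiscal_year(d, start_month=4):
    """Return the fiscal year containing date d, for a given start month."""
    return d.year if d.month >= start_month else d.year - 1

print(fiscal_year(date(2020, 4, 1)))                  # 2020
print(fiscal_year(date(2021, 3, 31)))                 # 2020
print(fiscal_year(date(2021, 3, 31), start_month=1))  # 2021 (calendar year)
```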

 

3. Display Values for Parent nodes
Values assigned to the ‘Display Measure’ field can now be displayed for parent nodes as well. You can choose from different aggregation options – weighted avg, average, and sum – under the Number Formatting tab to define the aggregation logic for parent nodes.
efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6

4. Conditional Formatting
The Conditional Formatting tab has been redesigned with the new features and controls listed below, for a better user experience and project management.

  • Conditional Formatting for the Parent node
    Unlike the earlier release, you can choose between defining the conditional formatting rule based on the parent node, the child node, or both nodes together. Setting conditional formatting on parent nodes makes it easier for viewers to spot tasks where individual sub-tasks are off track and require attention.
    efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6
  • Conditional Formatting for Progress bars- Track, fill and both
    Choose between different coloring options to highlight tasks based on conditional formatting rules. The new ‘Fill’ coloring option makes it easier to highlight a bar that has no progress value (tasks which are yet to start, or delayed).
    efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6

 

5. Reference Lines
The Reference Lines feature has been moved to the Advanced Editor tab and has been completely redesigned to support multiple reference lines and a reference range (highlighting a range on the timeline).

    • Multiple reference lines – The xViz Gantt provides users with the option to set multiple reference lines based on different logic –
        • Today’s Date
        • First of Date
        • Last of Date
        • Custom Date

efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6

 

  • Reference Range – Use the reference range option to highlight a specific range on your timeline.
efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6

 

6. Web URLs
The Web URL tab under the Advanced Editor provides different display options for long web URLs, which are as follows:

  • Hyperlink Text
  • Hyperlink Icon
  • Web URL link

efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6

 

7. Milestone Moved to Advanced Editor
The Milestone section has been moved to the Advanced Editor to support some new milestone customization options.

 

8. Multiple connector lines configuration
In the previous xViz Gantt chart version, you had to define a secondary connector type to display more than one connector line. Now you can use the primary connector field alone to display multiple connector lines. For this, you would need to define the ‘connector to’ field in the following format:
[“Task1”, “Task 2”], or, if you have an ID field assigned to the ID tab, you could also use IDs: [Task ID, Task n ID]

efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6
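As an illustrative sketch (not part of the visual itself), the multi-target ‘connector to’ value is just a JSON-style array of task names, so it can be assembled with any JSON serializer, e.g. in Python:

```python
import json

def connector_to(*tasks: str) -> str:
    """Build the multi-target 'connector to' value, e.g. ["Task1", "Task 2"]."""
    return json.dumps(list(tasks))

print(connector_to("Task1", "Task 2"))  # ["Task1", "Task 2"]
```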

 

9. Data label Enhancements
To address data label cut off issues for long data labels, a couple of new properties have been added which are as follows:

      • Data Label Alignment – choose from right, left and center
      • Offset – For better horizontal and vertical alignment
      • Padding – alter Trimline padding to handle long labels

efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6

 

10. Write back scenario
Along with the ability to make live edits to your Gantt chart at runtime, it now supports the writeback option as well for enhanced project management. This way you can write these new changes back to the underlying system. For more information on writeback for the xViz Gantt Chart, please refer to this blog

Note: You would need to request a custom build for the write-back feature, which is part of the Gantt Enterprise offering only.

 

11. Display Totals
This property displays an Overall Total node across all parent and child nodes.

 

12. Date Timeline formats
The xViz Gantt chart now provides different date formats for your timeline which are as follows:

  • Default – e.g. Q1, Q2
  • Combined – Displays combined time ranges together e.g. Q1-Q2, Q2-Q3
  • Days – Displays the number of days in each time range e.g. Q1: 90 days, Q2: 91 days

efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6
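For reference, the day counts shown by the ‘Days’ format amount to a simple calendar computation (a sketch; the visual does this internally):

```python
import calendar

def quarter_days(year: int) -> dict:
    """Days in each calendar quarter of the given year."""
    days = {}
    for q in range(4):
        months = range(3 * q + 1, 3 * q + 4)
        days[f"Q{q + 1}"] = sum(calendar.monthrange(year, m)[1] for m in months)
    return days

print(quarter_days(2019))  # {'Q1': 90, 'Q2': 91, 'Q3': 92, 'Q4': 92}
```

In a leap year, Q1 has 91 days instead of 90.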

 

13. Zoom Levels
The Zoom Levels property under the Advanced Editor tab provides the option to define the timeline levels you see while zooming in and out. You can set up to 9 different zoom levels and define different top, middle, and bottom levels based on user preference. As seen below, you can also skip levels; here the month level is skipped and the bottom-most level appears as weeks.
efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6

 

14. Row numbering (Enable ID)
As the name suggests, it provides a row number for each task and can be found under the Display Categories tab.

15. Locale support

The xViz Gantt chart provides language support for the following languages:

  • German
  • Spanish
  • French
  • Japanese
  • Portuguese
  • Russian
  • Turkish
  • Chinese

efficient-Project-Management-in-Power-BI-with-the-Latest-Gantt-Chart-Edition-v1.1.6

 

***

 xViz Gantt Enterprise is no longer included in the xViz Pro Suite and is a separate product offering. Read more about the Gantt Visual Configuration, Key Features, and Use Cases here.

 This blog has been originally published on the xViz website. Click here to visit and know more about Advanced visuals for Microsoft Power BI.

Subscribe to our Newsletter

The post Efficient Project Management in Power BI with the Latest Gantt Chart Edition v1.1.6 appeared first on Visual BI Solutions.

Advanced Trellis (Small Multiples) – Key Features of Power BI Visual

$
0
0

The xViz Advanced Trellis chart (aka small multiples) is a group of similar charts having a common axis and value fields split across a category. For example, you can compare the same Product Sales across different countries, where each country is a separate chart. Trellis charts are a great choice when it comes to analyzing trends across a category in a single view.

It is a common chart type found in BI tools like Power BI, Tableau, Qlik, Spotfire, etc., with a variety of different names like Small Multiples, Trellis, or grid charts. It is best used for scenarios where you want to analyze multiple categories (3 or more) across a given set of values (1-4) in a single view.

Key Features – Summary

  1. Trellis Type – Split Panels, Stacked charts, and Animated Charts
  2. Series Styling and Customization
  3. Ranking (Top N)
  4. Pagination
  5. Layout
  6. Average Panel
  7. Data Label customization
  8. Axis label customization
  9. Advanced conditional formatting
  10. Interaction options
  11. Drill down support with Breadcrumb

Key Features – Description

1. Trellis Type
The Trellis Type property gets activated when you have a category field assigned to the ‘Group By’ field well. There are 3 different modes for Trellis:

  • Split Panels
    As the name suggests, during this mode the Trellis category field is split further into additional panels by the ‘Group By’ category field. In the below example, Split Panels mode is enabled to view Sales across the Product Category/sub-category combination.
  • Stacked Charts
    The xViz Advanced Trellis, apart from supporting single and multi-series chart options, also supports stacked charts. You can display stacked column, bar, and area charts using this mode, where the ‘Group By’ field acts as a legend for the stacked charts.
  • Animated Charts
    The Animated Chart mode helps you visualize the change in trends across the assigned category field. You can define the layout and speed at which you want to loop through different visuals. The xViz Advanced Trellis chart has 2 different animated chart control options:
  • Button
  • Slider

2. Series Styling and Customization
The xViz Advanced Trellis chart provides extensive series styling and customization options. It offers 4 basic chart types that can be customized to create 10+ chart types, which are as follows:

  • Column and Bar
  • Line, Spline and Stepped Line Chart
  • Area, Spline Area and Stepped Area Chart
  • Lollipop Chart
  • Combo Charts
  • Stacked charts – Column, bar, and area
    advanced-trellis-small-multiples-key-features-power-bi-visual

3. Utility Menu
Represented by a gear icon, the Utility Menu provides a set of key runtime features for end-users. The following features are offered by the Utility Menu:

    • Ranking [Top N] 
      Ranking is one of the most commonly requested features in the data visualization world. It helps users sort information based on importance and quickly understand the overall trend and business health. Users can use the Advanced Trellis ranking feature to focus on the Top/Bottom N panels while bucketing the remaining panels into the ‘Others’ bucket.
    • Pagination
      The Pagination feature helps you easily navigate the Trellis and lets you split a large number of panels into smaller chunks for easier consumption of data. You can choose from different page controls, which let you span across different pages or even jump directly to the last or first page of the Trellis.
      advanced-trellis-small-multiples-key-features-power-bi-visual
    • Layout
      The Advanced Trellis is intelligent enough (Auto mode) to suggest the best panel layout based on the assigned data. This runtime feature makes it easy to fix or change the panel layout on the fly.
      advanced-trellis-small-multiples-key-features-power-bi-visual
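The Top/Bottom N bucketing described under Ranking can be sketched in a few lines of Python (illustrative only, not the visual's internal code):

```python
def top_n_with_others(values: dict, n: int) -> dict:
    """Keep the Top N categories by value and bucket the rest as 'Others'."""
    ranked = sorted(values.items(), key=lambda kv: kv[1], reverse=True)
    result = dict(ranked[:n])
    others = sum(v for _, v in ranked[n:])
    if others:
        result["Others"] = others
    return result

sales = {"USA": 120, "Germany": 90, "France": 70, "Spain": 40, "Italy": 30}
print(top_n_with_others(sales, 3))  # {'USA': 120, 'Germany': 90, 'France': 70, 'Others': 70}
```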

4. Averages

By displaying averages in the chart, one can easily compare the performance of each item: whether it is at par with, above, or below the average. The Average Type property helps display average values across the Trellis or within a panel (Average Datapoint).

advanced-trellis-small-multiples-key-features-power-bi-visual

Average Datapoint

advanced-trellis-small-multiples-key-features-power-bi-visual

Average Panel

5. Data Label Customization
The xViz Advanced Trellis chart provides unique data label customization options, which make it more suitable for displaying small multiples. The different data label formats are as follows:

  • Max & Min
  • Start & End
  • Limited datapoints – Max & Min + Start & End
  • All Datapoints
    advanced-trellis-small-multiples-key-features-power-bi-visual

6. Axis Label Customization
Quite often, we come across scenarios where axis labels tend to overlap each other or get cut off due to limited real estate or long labels, making the chart hard to read. To overcome these issues, the xViz Advanced Trellis, like the xViz Multi Axes chart, provides various axis label customization options, which are as follows:

  • Word wrap
  • Nth label – Skip labels to declutter
  • Stepped labels – Display Labels in Steps format to accommodate long labels
  • Rotate Label – Rotate labels to the desired angle for better viewing
    advanced-trellis-small-multiples-key-features-power-bi-visual

7. Advanced Conditional Formatting
Visually highlight outliers based on different business rules.
advanced-trellis-small-multiples-key-features-power-bi-visual

8. Interaction Options

Enlarge Panels

You can click on the enlarge panel option to zoom into panels for better visibility.
advanced-trellis-small-multiples-key-features-power-bi-visual

Interactive legends
Just like other xViz charts, the Trellis legends are interactive; clicking a legend item enables/disables the selected series to provide a better viewing experience.
advanced-trellis-small-multiples-key-features-power-bi-visual

9. Drilldown Support with Breadcrumb
The Breadcrumb helps in understanding the levels traversed to reach the current state while drilling down. In the below example, we can see that the Breadcrumb provides a clear overview that the user has drilled down on Telephones and Communications followed by Technology and is now viewing the Product Category level.
advanced-trellis-small-multiples-key-features-power-bi-visual

***

You can check out the other custom visuals in the xViz Pro Suite here. Check out videos and all the other resources for xViz Advanced Trellis here.

This blog has been originally published on the xViz website. Click here to visit and know more about Advanced visuals for Microsoft Power BI.


The post Advanced Trellis (Small Multiples) – Key Features of Power BI Visual appeared first on Visual BI Solutions.


Web Scraping and Sentiment Analysis in Power BI

$
0
0

In any e-commerce website, the overall rating allocated to a product is based on the individual reviews by customers. Lately, these ratings/reviews have become a key factor in determining whether a customer will buy a product or not. Using built-in features of Power BI like AI Insights, sentiment analysis and web scraping, reviews along with the product specification can be managed, studied and analysed, helping a great deal with the decision-making process.

In this blog, we are going to dynamically retrieve the reviews of mobile phones from Amazon and run sentiment analysis to understand if the product is preferred by the customers.

web-scrapping-sentiment-analysis-power-bi

Note:
Before proceeding with web scraping and any AI analysis, ensure the ‘New web table interface’ and ‘AI Insights function browser’ preview features are enabled under Options.

 

Web scraping product details

By selecting the data source as Web, Power BI simplifies retrieving contents from any webpage provided we have its corresponding URL. Since we are interested in retrieving mobile phone reviews and details, make a simple search for the required model of mobile phone on Amazon, retrieve its URL and enter it in the dialogue box that pops up when you select Web as your data source.
web-scrapping-sentiment-analysis-power-bi

When you click OK, a connection to the webpage is established and the entire contents corresponding to the URL will be pulled into Power BI as an HTML source. As a result, tables in the HTML will be available as a data source.

For example, Table 5 displays ratings of the item:
web-scrapping-sentiment-analysis-power-bi  web-scrapping-sentiment-analysis-power-bi

 

However, we are interested in collecting customer reviews. These reviews are not available as HTML tables but are stored under a certain HTML class. To retrieve the data, Power BI provides the ability to Add Tables using Examples.
web-scrapping-sentiment-analysis-power-bi

 

When we select this option, we will be able to add new columns and generate values for them based on sample data. Using this sample data, the values that map to the subsequent data hooks under the same class are automatically retrieved by Power BI. By adding a new column Reviews and including a few samples of data, all the other subsequent data gets automatically filled in by Power BI.
web-scrapping-sentiment-analysis-power-bi

 

Likewise, we add two more columns, Ratings and Review Text, the latter containing the comments given by the user in each review. The same process can be repeated to extract more content from the website.
web-scrapping-sentiment-analysis-power-bi

Sentiment Analysis

Once we have the source tables ready, the next step is to apply sentiment analysis over the contents web-scraped from Amazon. The Text Analytics function associated with the AI Insights feature of Power BI allows users to build the sentiment analysis model. It utilizes Azure Cognitive Services to obtain the sentiment score.

Sentiment analysis takes a text input and runs a machine learning algorithm which assigns a score ranging from 0 to 1. A score of 0 indicates negative sentiment and 1 indicates positive sentiment. The model in Cognitive Services is trained with a dictionary of texts that are mapped to their corresponding sentiments. This trained model is used to assign the sentiment score for the reviews extracted.

The scores are generated in a new column which can be classified into positive (> 0.5), negative (<0.5) and neutral (=0.5).
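The classification rule above can be expressed as a small helper (a sketch of the logic, not Power BI code):

```python
def classify_sentiment(score: float) -> str:
    """Bucket a 0-1 sentiment score into Positive / Negative / Neutral."""
    if score > 0.5:
        return "Positive"
    if score < 0.5:
        return "Negative"
    return "Neutral"

print([classify_sentiment(s) for s in (0.92, 0.13, 0.5)])  # ['Positive', 'Negative', 'Neutral']
```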

 

Enabling Parameters to the URL

As the final step, create a new parameter to dynamically modify the source URL. By using parameters, similar information corresponding to different products can be extracted by manipulating the URL source.
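Conceptually, the parameter plays the role of a placeholder inside the URL. The same idea in plain Python (the template below is hypothetical; the real URL is taken from the browser):

```python
def product_url(template: str, product_id: str) -> str:
    """Substitute a product identifier into a parameterised URL template."""
    return template.format(product_id=product_id)

# Hypothetical review-page template, for illustration only.
template = "https://www.amazon.com/product-reviews/{product_id}"
print(product_url(template, "B07XYZ1234"))
```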
web-scrapping-sentiment-analysis-power-bi

web-scrapping-sentiment-analysis-power-bi

 

Sentiment Analysis Report

After enabling the parameters and substituting them into the source URL, a Power BI report can be built to analyse the sentiment scores obtained over the product reviews. The primary result of the sentiment analysis, which determines whether a product is preferred by customers, is indicated in a gauge. The range of the gauge varies from 0 to 1, indicating the sentiment score; if the average sentiment score is greater than 0.5, then the product has positive reviews among its customers. Review classification tells us the weightage of positive and negative reviews, and the distribution of sentiment scores provides a detailed summary of the information previewed in the gauge and the donut chart. Ratings out of 5 as rated by the customers are displayed in the pie chart, and the feature ratings obtained from web scraping are displayed in a column chart. Finally, a word cloud over the review texts highlights the key phrases: the frequency of a word in the review texts determines its size, so the most impactful words can be identified at a single glance.
web-scrapping-sentiment-analysis-power-bi

 

This simple report with basic sentiment analysis can be run across multiple products by manipulating the URL to identify if the product is preferred by the customers or not.

Learn more about Microsoft Power BI services offerings from Visual BI solutions here.


The post Web Scraping and Sentiment Analysis in Power BI appeared first on Visual BI Solutions.

Demystifying the XMLA Endpoint-write operations in Power BI Premium

$
0
0

Organizations need semantic models that serve as the single source of truth for the enterprise to help in intelligent decision-making. Power BI, through XMLA endpoints, provides open-platform connectivity for Power BI datasets in premium capacities at the 1500 and higher compatibility levels. The read-only XMLA endpoint has been generally available since 2019; using it, the rich semantic Power BI models can be used in other tools, and customer solution implementations became highly scalable. Now, Microsoft has enabled write operations in addition to read in the XMLA endpoint as a public preview feature, and this brings more potential to Power BI in terms of capabilities.

This allows customers to leverage semantic models compatible with a wide range of tools like Visual Studio with the Analysis Services projects extension, SQL Server Management Studio, PowerShell cmdlets, DAX Studio, Microsoft Excel, and other third-party tools. Through read/write XMLA endpoints, these tools provide a rich set of advanced semantic modelling capabilities, dataset management, debugging, and monitoring. In this blog, we will mainly discuss the write operations supported through XMLA endpoints.

   

Supported Write Operations 

The read/write XMLA endpoint enables Visual Studio, Tabular Editor and other editors to provide additional semantic modelling capabilities supported by the Analysis Services engine, but not yet supported in Power BI Desktop. These powerful functionalities introduced to Power BI Premium datasets include:

  • Calculation groups, which reduce the number of redundant measures by grouping common measure expressions as calculation items. This way, they provide calculation reusability.
  • Metadata translations to support multi-lingual reports. The tabular model objects can have multiple translations of their names/descriptions.
  • Perspectives to define more refined views of the model specific to business domains. By this, users who need only a limited part of the model can be given a perspective instead of deterring them with the complex full model. This enhances the user experience too.
  • Deploying models from Visual Studio with the Analysis Services projects extension, with a rich set of semantic modelling capabilities
  • Fine-grained refresh capabilities using SQL Server Management Studio.

We can see in detail how to utilize XMLA endpoints to implement the above-listed features. 

 

1. Read/write XMLA endpoint

Prerequisites 

  • Enable XMLA read/write: By default, it is read-only for a premium capacity. In the admin portal, change it to read/write in the capacity settings.
    demystifying-xmla-endpoint-write-operations-power-bi-premium
  • Enhanced metadata: XMLA write operations on datasets authored in Power BI Desktop and published to a Premium workspace require enhanced metadata. Explore enhanced metadata in detail here. In Power BI Desktop preview features, enable the “Store datasets using enhanced metadata format” option.
  • Copy the Power BI premium workspace connection URL from ‘Settings’ ->’Premium’ -> ‘Workspace Connection’.

 

2. Calculation groups

Calculation groups work with explicit DAX measures. The ‘DiscourageImplicitMeasures’ model property should be set to ‘true’ to create calculation groups. Here we are using calculation groups for reusable time intelligence calculations. In Visual Studio, create a ‘Timecalgroup’ calculation group and its individual calculation items, and edit their DAX formulas accordingly.
demystifying-xmla-endpoint-write-operations-power-bi-premium
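For illustration, a ‘YTD’ calculation item inside such a group typically wraps the selected measure in a time-intelligence function. A sketch, assuming a 'Date'[Date] column (the names are illustrative, not from the model above):

```dax
-- 'YTD' calculation item: applies year-to-date logic to whichever measure is selected
CALCULATE ( SELECTEDMEASURE (), DATESYTD ( 'Date'[Date] ) )
```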

Calculation groups can be used after deploying to the workspace. We can reuse it with any DAX measure to get its time intelligence calculations. We can see the calculation group in action below for ‘sales’ DAX measure.

demystifying-xmla-endpoint-write-operations-power-bi-premium

Time Calculation Group

 

3. Metadata translations

Multiple translated strings can be given to objects like tables, columns and measures for their names and descriptions. In Visual Studio, we can create, manage, and import multiple translations based on the requirement. We can see metadata translations in action below for the Spanish, French and Portuguese languages.

demystifying-xmla-endpoint-write-operations-power-bi-premium

Metadata Translations

 

4. Perspectives

We can create perspectives on a group of fields from our model for each department in the organization to keep the work focused. In Visual Studio, select ‘Create and Manage Perspectives’.

demystifying-xmla-endpoint-write-operations-power-bi-premium

Manage Perspective

Select the required fields and deploy the model. We can see how to access the created perspectives in Power BI below.

demystifying-xmla-endpoint-write-operations-power-bi-premium

Accessing perspectives in Power BI

 

5. Fine-grained refresh capabilities in SQL Server Management Studio

We can access Power BI Premium datasets from SQL Server Management Studio by connecting to the workspace URL. With this, we can do an incremental refresh for specific partitions alone, selected based on need, whereas in the Power BI workspace only a refresh of the entire set of partitions is possible. Refresh operations through the XMLA endpoint are not limited to 48 refreshes per day, and the scheduled refresh timeout is not imposed. The incremental refresh by partitions is shown in action below.
demystifying-xmla-endpoint-write-operations-power-bi-premium
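Under the hood, a partition-level refresh issued from SQL Server Management Studio is a TMSL command of roughly this shape (the database, table, and partition names here are placeholders):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "SalesModel",
        "table": "Sales",
        "partition": "Sales2020"
      }
    ]
  }
}
```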

Thus, with the XMLA read/write enabled endpoint, Power BI Premium datasets have more parity with Azure Analysis Services enterprise-grade tabular modelling tools and processes. With this, the Power BI platform is nearly converging both enterprise and self-service BI into a single powerful platform.

Learn more about Microsoft Power BI services offerings from Visual BI solutions here.


The post Demystifying the XMLA Endpoint-write operations in Power BI Premium appeared first on Visual BI Solutions.

Connecting Python with Tableau

$
0
0

In this Internet era, the amount of data being generated from various sources with detailed granularity is humongous. Data being the new gold, it has become inevitable to use it for advanced analytics to derive useful business insights for the betterment of one’s organization. Python and Tableau are among the best technologies addressing analytics challenges. This blog briefs about both tools and walks us through the steps to connect to Python from Tableau.

 

Python

Python is an object-oriented, open-source, high-level programming language with dynamic semantics. Though Python has many high-level data structures, it is very user-friendly and simple to learn. Python also supports packages and modules, which help increase program modularity and allow code to be integrated and reused with other technologies.

Tableau

Tableau is a self-service data visualization tool that helps customers view and present data in the form of interactive dashboards and charts to showcase insights and perform real-time data analytics. Tableau is very user-friendly, as it provides a drag-and-drop user interface to visualize the available data, with minimal scripting required only for calculated fields.

Prerequisites

Connecting Python to Tableau requires us to install a Python IDE (Anaconda Navigator), which is also an open-source tool.

How to integrate Python with Tableau?

TabPy is the API that enables running Python code from within a Tableau workbook.

Steps to integrate Python with Tableau:

1. After downloading Anaconda Navigator, the next step is to download the TabPy server. The TabPy server can be installed by typing conda install -c anaconda tabpy-server in the Anaconda prompt.
connecting-python-tableau

After all the packages are installed it will ask for yes or no to proceed, press y to install the server.
connecting-python-tableau

2. After the TabPy server is installed, the server should be started in order to connect with Tableau. To start the TabPy server, change the directory from the root directory to the folder where the TabPy server is installed. This can be done by typing cd C:\Users\*your username*\Anaconda3\pkgs\tabpy-server-0.2-py37_1\Lib\site-packages\tabpy_server in the Anaconda prompt.

3. The next command startup.bat can be typed to start the server.
connecting-python-tableau

4. After the startup.bat command, the prompt displays port number 9004, on which the TabPy server is initialized. After initializing the server, the next part is to connect the server with Tableau. Open Tableau Desktop.

5. In Tableau go to:

  • Open the Help menu.
  • Choose Settings and Performance.
  • Under Settings and Performance, choose Manage External Service Connection.
  • Select TabPy/External API.
  • Select localhost.
  • Make sure the port number is 9004.

Click on test connection to cross-check the connectivity.
connecting-python-tableau

Once the connection is successful then click on the OK button to approve the external connection.

 

Why Python+Tableau?

When TabPy is used with Tableau, calculated fields can be defined in Python, which enables us to use the power of many machine-learning libraries right from Tableau visualizations. It enables many new features like machine learning predictions, sentiment analysis, and time series forecasting using various models via customized calculated fields.
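For example, a sentiment polarity score could be computed in a Tableau calculated field with SCRIPT_REAL. This is a sketch, assuming the TextBlob library is installed on the TabPy server and the workbook has a [Review Text] field (both are assumptions, not part of the setup above):

```
SCRIPT_REAL("
from textblob import TextBlob
return [TextBlob(t or '').sentiment.polarity for t in _arg1]
", ATTR([Review Text]))
```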

 

Limitations of integrating Python with Tableau

Though there are many advantages of enabling TabPy, there are also certain limitations.

  • When a large dataset is used, waiting time will be longer, as the script runs each time you make a change to the view.
  • Python-generated calculated fields will not be extracted if you create a Tableau extract.
  • The Python script will run only when you put it into the view.
  • You cannot use TabPy calculations to create values and base additional calculations on those values unless both calculated fields are used in the view.

When deployed together, Python integrated with Tableau can help deliver a scalable, flexible and advanced analytics platform.

To learn more about Visual BI’s Tableau Consulting & End User Training Programs, contact us here.


The post Connecting Python with Tableau appeared first on Visual BI Solutions.

Utilizing Azure Log Analytics Workspace for Azure Storage Account Logs

$
0
0

Logging is a crucial administrative task as it helps identify details of an event that occurred. Logs usually store the details such as the username, time, action of the user and metadata of the event. It is helpful for auditing and for forensic examination in the event of a crime.   

Azure provides various monitoring tools that help identify resource usage and bottlenecks. Azure Storage provides logging feature that gives information on the events that occurred on the storage account.

 

Azure Log Analytics Workspace 

Log Analytics workspace is a service provided in Azure that enables us to collect logs from multiple services like an Azure Storage account and Azure Virtual Machines. The logs collected based on events can then be queried using a custom language called KQL (Kusto Query Language). KQL, also known as the ‘Log Analytics query language’, is like SQL with the additional capability to render charts.
utilizing-azure-log-analytics-workspace-azure-storage-account-logs utilizing-azure-log-analytics-workspace-azure-storage-account-logs

You can add various types of events for loading into the Log Analytics workspace, and then combine them in dashboard tiles.

Thus, Log Analytics Workspace provides a single place where you can store logs from different services, query them and build a dashboard from it.

Azure Storage Account

Azure Storage Account provides a storage platform on the cloud enabling us to store various kinds of data. Data can be stored as blobs, tables or queues. Lots of read/write/delete operations usually occur on the storage, and you might need to keep track of who is doing what.

To enable logging on an Azure Storage account, open the respective storage account, go to Monitoring (classic) -> Diagnostic settings (classic), select the version, and check the operations you need to log (read/write/delete).
utilizing-azure-log-analytics-workspace-azure-storage-account-logs

Azure Storage provides two versions of logging, and v2.0 is a superset of v1.0. The log generated contains the following details: resource IDs, request type, the operation performed, operation status, and network information like header size and authentication. v2.0 contains more details with respect to the unique IDs (UUIDs) of all the entities tied to the event.

The logs generated can be seen in the Azure Storage explorer under ‘$logs’ in the corresponding storage account. At the time of publishing this blog, ‘$logs’ is not visible in the preview version of Azure Storage Explorer in the Azure portal.

The log files are organized in a year/month/day folder structure and the file contents are ‘;’ separated. The log files can be downloaded and analyzed in your favorite tool or can be automatically imported into the Log Analytics workspace.
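A minimal sketch of reading such a file outside the workspace, assuming the semicolon-delimited layout described above (field positions in the sample line are illustrative):

```python
import csv
import io

def parse_storage_logs(text: str) -> list:
    """Split Storage Analytics log lines on ';' (the format is semicolon-delimited)."""
    reader = csv.reader(io.StringIO(text), delimiter=";")
    return [row for row in reader if row]

sample = "1.0;2020-05-18T01:02:03.000Z;GetBlob;Success\n"
print(parse_storage_logs(sample)[0][2])  # GetBlob
```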

Loading log files into Log Analytics Workspace

At the time of publishing this blog, there is no direct way to connect ‘$logs’ to the analytics workspace. Microsoft has provided a PowerShell script that can be run to fetch logs and post them in the workspace.

https://github.com/Azure/azure-docs-powershell-samples/blob/master/storage/post-storage-logs-to-log-analytics/PostStorageLogs2LogAnalytics.ps1

Steps to load data:

  1. Download the PowerShell program from the link provided above. Using ‘PowerShell ISE’ to run the program is recommended.
  2. Insert your respective ids at the top of the program. The details of getting the ids have been provided in the comments on top of each variable.
  3. ‘$LogType’ is the name of the table that will be created for the logs from this storage account. The table name will also have ‘_CL’ appended during creation.
    utilizing-azure-log-analytics-workspace-azure-storage-account-logs
  4. Once you have inserted the required fields, run the program and it should import all the logs to the workspace. You can automate this using the Azure DevOps.

Querying Log Analytics Workspace

Once the logs are imported, open the Log Analytics workspace, select ‘Logs’ in the left pane, and you should see your logs under the Custom Logs hierarchy. To query, you need to use KQL (Kusto Query Language), which is like SQL.

Suppose gen2_logs_CL is my custom log table and I need to select Operation_Type. In SQL, we would write it as below:

SELECT Operation_Type FROM gen2_logs_CL

In KQL:

gen2_logs_CL | project Operation_Type

In the below image, we have grouped by the operation type and created a pie chart to see which operations are the most common in the storage account. The render command is specific to KQL and is used to produce a chart from the output of the query.
utilizing-azure-log-analytics-workspace-azure-storage-account-logs
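The query behind such a chart has the following shape in KQL (assuming the gen2_logs_CL table created earlier):

```kusto
gen2_logs_CL
| summarize count() by Operation_Type
| render piechart
```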

Dashboards in the Log Analytics workspace allow us to collect the various queries we create across different services in a single place. This allows us to get a quick look at the logs of all the services.

After creating this, you can add the pie chart to a new dashboard or an existing dashboard within the Log Analytics Workspace which will automatically update every time the custom log table is updated.

Thus, by using Azure Storage Analytics and the Log Analytics workspace, we can derive useful insights into the events that happen in an Azure Storage Account.

Reach out to us for implementation details or questions. Learn more about Visual BI’s Microsoft Azure offerings here.

 

Subscribe to our Newsletter

The post Utilizing Azure Log Analytics Workspace for Azure Storage Account Logs appeared first on Visual BI Solutions.

Web Scraping and Sentiment Analysis in Power BI


On any e-commerce website, the overall rating allocated to a product is based on individual customer reviews. Lately, these ratings and reviews have become a key factor in determining whether a customer will buy a product. Using built-in Power BI features such as AI Insights, sentiment analysis and web scraping, reviews along with product specifications can be gathered, studied and analysed, helping a great deal with the decision-making process.

In this blog, we are going to dynamically retrieve the review of mobile phones from Amazon and run sentiment analysis to understand if the product is preferred by the customers.

web-scrapping-sentiment-analysis-power-bi

Note:
Before proceeding with web scraping and any AI analysis, ensure that the ‘New web table inference’ and ‘AI Insights function browser’ preview features are enabled under Options.

 

Web scraping product details

By selecting Web as the data source, Power BI simplifies retrieving content from any webpage, provided we have its URL. Since we are interested in retrieving mobile phone reviews and their details, do a simple search for the required mobile phone model on Amazon, copy its URL and enter it in the dialogue box that pops up when you select Web as your data source.
web-scrapping-sentiment-analysis-power-bi

When you click OK, a connection to the webpage is established and the entire content behind the URL is pulled into Power BI as an HTML source. As a result, the tables in the HTML become available as data sources.

For example, Table 5 displays ratings of the item:
web-scrapping-sentiment-analysis-power-bi  web-scrapping-sentiment-analysis-power-bi

 

However, we are interested in collecting customer reviews. These reviews are not available as HTML tables but are stored under a certain HTML class. To retrieve this data, Power BI provides the Add Table Using Examples feature.
web-scrapping-sentiment-analysis-power-bi

 

When we select this option, we can add new columns and generate values for them based on sample data. From the samples, Power BI automatically retrieves the values that map to subsequent elements under the same HTML class. By adding a new column Reviews and entering a few sample values, all the remaining rows are filled in automatically by Power BI.
web-scrapping-sentiment-analysis-power-bi
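Under the hood, this feature generates Power Query M that extracts page elements by CSS selector. A minimal sketch of what such a generated query might look like; the URL and the selectors here are illustrative placeholders, not Amazon's actual markup:

```m
let
    // Web.BrowserContents renders the page and returns its HTML
    Source = Web.BrowserContents("https://www.amazon.com/product-reviews/B000000000"),
    // Html.Table builds one column per {name, CSS selector} pair
    Reviews = Html.Table(Source, {
        {"Reviews", "a[data-hook='review-title']"},
        {"Ratings", "i[data-hook='review-star-rating']"},
        {"Review Text", "span[data-hook='review-body']"}
    })
in
    Reviews
```

You can inspect the actual query Power BI produces in the Advanced Editor.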

 

Likewise, we add two more columns, Ratings and Review Text, the latter containing the comments given by the user in each review. The same process can be repeated to extract more content from the website.
web-scrapping-sentiment-analysis-power-bi

Sentiment Analysis

Once we have the source tables ready, the next step is to apply sentiment analysis to the content scraped from Amazon. The Text Analytics functions under the AI Insights feature of Power BI allow users to build the sentiment analysis model; they utilize Azure Cognitive Services to obtain the sentiment score.

Sentiment analysis takes a text input and runs a machine learning algorithm that assigns a score ranging from 0 to 1, where 0 indicates negative sentiment and 1 indicates positive sentiment. The model in Cognitive Services is trained on a corpus of texts mapped to their corresponding sentiments, and this trained model is used to assign a sentiment score to each extracted review.

The scores are generated in a new column and can be classified into positive (> 0.5), negative (< 0.5) and neutral (= 0.5).
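That classification rule can be expressed as a simple function; a sketch in Python, using the 0.5 threshold stated above:

```python
def classify_sentiment(score: float) -> str:
    """Map an Azure Cognitive Services sentiment score (0..1)
    to a review class, using 0.5 as the neutral point."""
    if score > 0.5:
        return "Positive"
    if score < 0.5:
        return "Negative"
    return "Neutral"
```

In the report itself, the same rule would typically be written as a DAX or Power Query conditional column over the score.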

 

Enabling Parameters for the URL

As the final step, create a new parameter to dynamically modify the source URL. By using parameters, similar information for different products can be extracted simply by changing the parameter value.
web-scrapping-sentiment-analysis-power-bi

web-scrapping-sentiment-analysis-power-bi
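In M terms, the parameter simply replaces the hard-coded address in the query's source step. A sketch, where ProductURL is an assumed name for the text parameter:

```m
let
    // ProductURL is a text parameter created via Manage Parameters
    Source = Web.BrowserContents(ProductURL)
in
    Source
```

Changing the parameter value and refreshing re-runs the whole extraction against the new product page.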

 

Sentiment Analysis Report

After enabling the parameter and substituting it into the source URL, a Power BI report can be built to analyse the sentiment scores obtained for the product reviews. The primary result of the sentiment analysis, which determines whether a product is preferred by customers, is indicated in a gauge ranging from 0 to 1: if the average sentiment score is greater than 0.5, the product has positive reviews among its customers. The review classification shows the weightage of positive and negative reviews, while the distribution of sentiment scores provides a detailed breakdown of the information summarized in the gauge and the donut chart. Ratings out of 5, as given by the customers, are displayed in a pie chart, and the feature ratings obtained through web scraping are displayed in a column chart. Finally, a word cloud over the review texts highlights the key phrases: the frequency of a word in the reviews determines its size, so the most impactful words can be identified at a glance.
web-scrapping-sentiment-analysis-power-bi

 

This simple report with basic sentiment analysis can be run across multiple products by changing the URL parameter, to identify whether a product is preferred by customers or not.

Learn more about Microsoft Power BI services offerings from Visual BI Solutions here.


The post Web Scraping and Sentiment Analysis in Power BI appeared first on Visual BI Solutions.
