R with PowerBI – A Step-by-Step Guide

There is a lot of interest everywhere in how to integrate R scripts with Microsoft PowerBI dashboards. Here goes a step-by-step guide.

Let's assume you have a couple of ready-made pieces of R code available, for example using the ggplot2 library. The following scripts perform analytics on the CHOL (cholesterol) data.

  1. Open RStudio or the base R console (from CRAN) & install the ggplot2 library first.

  2. Paste the following R script & execute it.

chol <- read.table(url("http://assets.datacamp.com/blog_assets/chol.txt"), header = TRUE)
# Take the column "AGE" from the "chol" dataset and make a histogram of it
library(ggplot2)
qplot(chol$AGE, geom = "histogram")
ggplot(data = chol, aes(chol$AGE)) + geom_histogram()

You should be able to see a visual output like this.


3. Next, execute the following R code, which sets the binwidth argument of the qplot() function.

qplot(chol$AGE,
      geom = "histogram",
      binwidth = 0.5)


4. Let's add a main title & axis labels, as we would with R's hist() function.

# Add a title and axis labels, as with the hist() function
qplot(chol$AGE,
      geom = "histogram",
      binwidth = 0.5,
      main = "Histogram for Age",
      xlab = "Age")


5. Now, add a col argument, with the color nested inside the I() function.

# Add col argument, with the color nested inside the I() function
qplot(chol$AGE,
      geom = "histogram",
      binwidth = 0.5,
      main = "Histogram for Age",
      xlab = "Age",
      col = I("red"))

6. Next, adjust the ggplot2 version a little with the following code.

# Adjusting ggplot
ggplot(data = chol, aes(chol$AGE)) +
  geom_histogram(breaks = seq(20, 50, by = 2),
                 alpha = .2) +
  labs(title = "Histogram for Age") +
  labs(x = "Age", y = "Count") +
  xlim(c(18, 52))

7. Plot a bar graph with the following code.

# Plotting a bar graph
qplot(chol$MORT,
      geom = "bar",
      main = "Bar Graph for Mort",
      xlab = "Mort")


8. Next, open the PowerBI Desktop tool. You can download it free from this link. Now, click the Get Data tab to start exploring & connect with the R dataset.

If R is already installed on the same system where you are building the PowerBI visuals, you just need to paste the R script into the script editor; otherwise, you need to install R on the system running PowerBI Desktop, like this.


9. Next, you can also choose the custom R visual in the PowerBI Desktop Visualizations pane, provide the required R script to build the visual & finally click Run.
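As a hedged sketch of such a script: inside a PowerBI R visual, the fields you drag in are exposed to your script as a data frame named dataset. Assuming the AGE column of the chol data has been added as a field (an assumption about your field list), the pasted script could be:

```r
# PowerBI supplies the selected fields to the R visual as a data frame named `dataset`
library(ggplot2)
ggplot(dataset, aes(x = AGE)) +
  geom_histogram(binwidth = 0.5) +
  labs(title = "Histogram for Age", x = "Age", y = "Count")
```

The same pattern works for any of the qplot/ggplot snippets above: replace the chol data frame with dataset & keep the rest of the plotting code unchanged.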



10. Build all the R function visuals by following the same steps & finally save the dashboard.


11. You can refresh an R script in Power BI Desktop. When you refresh an R script, Power BI Desktop runs it again in the Power BI Desktop environment.




Quick Installation of Single node Datazen Server in Azure Cloud Service & Sample Dashboards

This demo provides step-by-step guidance on setting up a Datazen server on a single node & connecting with the Publisher app to build custom visuals.

Pre-requisites for the demo : 

  1. An active Azure subscription.
  2. Windows 10 with the Windows Store (for installation of Datazen Publisher)

Detailed steps are depicted as follows:

1.     Go to https://manage.windowsazure.com/

2.     Login with your Live ID.

3.     Click +New.

4.     Select Compute -> Virtual Machine -> From Gallery.
(Fig. 2)

5.     Select the Windows Server 2012 R2 Datacenter image.

6.     Click the next arrow at the bottom right.



7.     Enter the required information for Virtual machine configuration.

o   Virtual machine name

o   Choose Basic or Standard Tier (recommended)

o   Choose A4 as the machine size.  A4 has 8 cores, which is the minimum number of cores for a single machine setup.

o   Enter the machine admin username/password.

o   Click the next arrow.


8.     Enter the required information for the cloud service configuration.

o   Select option Create a new cloud service.

Make sure the Cloud Service DNS Name is available.

o   Choose the subscription it should be billed to, the region it should be deployed.

Choose the one closest to your location

o   Leave the Storage Account and Availability Set as is.

o   Add an HTTP endpoint at a minimum.

You may need to scroll to add the endpoint.

o   Click the next arrow.

9.     Select Install VM Agent and leave the other options unchecked.

10.  Click the checkmark to start the deployment process.

You’ll see it start the provisioning process in the list of the virtual machines you are responsible for in Azure.


11.   Wait for the status to change from Starting to Running in virtual machines.  
12.   Select your VM then click Connect.  
13.   Save the RDP file to your local machine.  
14.   Open the saved Remote Desktop Connection file, then click Connect.  
15.   Connect to the VM via Remote Desktop and enter the admin username/password.
16.   Click Yes to connect to Server  
17.   Click Configure this local server in the Server Manager dashboard that appears when you login.  
18.  Then change the IE Enhanced Security Configuration to Off for Administrators.

You can always change it back if you really want to when you’re done.

19.  Close the Server Manager.




Section 2: Install the Datazen Server

1.     Navigate to the following link and download the Datazen server software onto the VM. You may need to turn off IE Enhanced Security on the server to do so.

2.     The Datazen server files download as a zipped file. Extract all the files

3.     Open Datazen Enterprise Server.3.0.2562.

4.     Click run to start the install process.

5.     In the Datazen Enterprise Server Setup, click Next.
6.     Click Next in the Setup wizard, accepting the terms in the License Agreement and moving through each screen.


7.     Click Next on the Features page


8.     Click Next on the Core Service Credentials page.  
9.     Once you get to the Admin Password page, type a Password for the Datazen admin user.  (Fig. 20)

This doesn’t have to be the same password as you used for the server.

10.  Click Next. 

11.  On the Authentication page, leave the Authentication Mode as Default.

12.  Click Next.

13.  On the Repository Encryption page, select Copy to Clipboard, then paste the key into a Notepad file.

14.  Save the Notepad file to a safe location.


15.  Click Next. 

16.  On the Instance Id page, select Copy to Clipboard then paste the Id into a Notepad file.

17.  Save the Notepad file to a safe location.


18.  Click Next. 

19.  On the Data Acquisition Service Credentials page, leave the credentials as is, then click Next.  
20.  On the Web Applications IIS Settings page leave the default settings, then click Next.   
21.  On the Control Panel Email Settings page, leave the default values since this is a test server.

22.  Click Next. 

23.  On the Ready to Install page, click Install and wait until the installation is complete.
This might take a few minutes.

Section 3: Configure the Datazen Server

1.   Open your browser in your local machine

2.   Navigate to http://mycloudservicename.cloudapp.net/cp.

Make sure you replace mycloudservicename with the name of your cloud service.

3.   If you can successfully connect, you should see the Control Panel Log In screen.

4.   Enter the username admin and the password you entered in the Setup wizard, then select  Log In.

5.   You will need to create a new user before creating dashboard hubs, since each hub must have an owner. The owner cannot be the admin user. Click Create User to create your first user.




6.   Enter a value in the top three fields (the email address can be fake if you want) and select Create User.  
7.   You will now see a new option to Create BI Hub.
8.   Enter any hub name you'd like, but make sure you enter the username of the user you just created as the owner username.

9.   Enter the maximum number of users that can connect to the hub.

10.  Click Create. 

11.   Finish the creation of the hub. It will be displayed in the list of available hubs.  
12.  The new hub will also be shown in the navigation menu at the bottom left of the screen.  
13.  Click the Server Users link on the left-hand side of the screen.


14.  Click Create User.   
15.  Fill in the fields under Required Info.

16.  Click Create User.

17.  You will see the user and a Set password link option next to the username.

18.  Click on Set password link and then copy the link to your clipboard.
Note: This step is only required as the email notification is not set up.

19.  Logout as the admin

20.  Open a new browser window and paste the URL to reset the password into the address bar.

You can now finish setting up that user by entering the password for the account.

21.  On the Control Panel Activate User Account screen, enter the new password, then re-type the password.

22.  Click Activate My Account. (Fig. 40)

23.  Logout as this user and log back in as the admin before proceeding.



Section 4: Apply a Custom Branding

1.  To add the Wide World Importers brand package to the server, save it locally. The package is provided with this demo.

2.  Click on the Branding link on the left-hand side and upload the brand package to the server.

3.  Make sure you choose the Server to upload it to.

You will see the Server icon has the Wide World Importers branding associated.


4.  To make sure it was applied properly, open a new browser and navigate to the following URL (make sure you replace the mycloudservicename with whatever you named yours)


Your Server Login screen should look as shown on the right, now having the Wide World Importers brand package applied. (Fig. 43)

Server Login


Section 5: Connect to the Datazen Server with Publisher

1.  Open the Datazen Publisher app

If this is the first time using the app, you will have the option of connecting to the Datazen demo server. We recommend doing that, so you will have some nice demo dashboards to show immediately.

2.  To add a new server, right-click in the dashboard, then click Connected. (Fig. 44)

3.  Click Add New Server Connection. (Fig. 45)
4.  Provide the following information to connect to a Datazen server. (Fig. 46)

Server Address: mycloudservicename.cloudapp.net
User name: the username of the user you created
Password: that user's password

5.  Uncheck Use Secure Connection.

6.  Click Connect. (Fig. 46)

7.  When connected, you should be able to publish dashboards to your Datazen server. (Fig. 47)

8.  You will see a nice dashboard with KPIs for Wide World Importers and Fabrikam Insurance. (Fig. 47)

Datazen Screen






Azure Stream Analytics & Machine Learning Integration with a Real-Time Twitter Sentiment Analytics Dashboard on PowerBI

Recently, the integration of ASA & AML was introduced as a preview update; it's now possible to add an AML web service URL & API key as a 'custom function' over ASA input. In this demo, realtime tweets are collected based on keywords like '#HappyHolidays2016', '#MerryChristmas' & '#HappyNewYear2016' & stored directly in a .csv file saved on OneDrive. Here goes the solution architecture diagram of the POC.




Now, add the Service Bus Event Hub endpoint as input to the ASA job. Meanwhile, to deploy the 'Twitter Predictive Sentiment Analytics Model', click 'Open in Studio' & start deploying the model. Don't forget to run the experiment before deploying.



Once the model is deployed, open the 'Web Service' dashboard page to get the model URL & API key: click the default endpoint & download the Excel 2010 or earlier workbook. Collect the URL & API key to apply to the ASA function credentials for the AML deployment.


Next, create an ASA job & add the Event Hub credentials where the real-world tweets are being pushed. Click the 'Functions' tab of the ASA job to add the AML credentials: provide the model name, URL & API key of the model & once it's added, click Save.



Now, add the following ASA SQL to aggregate the realtime tweet sentiment scores coming out of the predictive Twitter sentiment model.
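A minimal sketch of such a query (assuming the AML function was registered in the ASA job as sentiment & that the input/output aliases are TwitterStream & output; those names are assumptions):

```sql
-- Score each incoming tweet with the AML web service function,
-- then emit the text alongside its sentiment score
WITH scored AS (
    SELECT text, sentiment(text) AS result
    FROM TwitterStream
)
SELECT text, result.[Score] AS SentimentScore
INTO output
FROM scored
```

The AML function returns a record, so individual output fields are read with the result.[FieldName] syntax.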



Provide the output as Azure Blob storage with a container name & CSV serialization type & start the ASA job. Then start importing data into PowerBI Desktop from the ASA output blob storage account.




PowerBI Desktop contains built-in Power Query to prepare the ASA output data & set data types. Choose Decimal as the data type for the AML model sentiment score & Text (String) for TweetTexts.



Start building the 'Twitter Sentiment Analytics' dashboard powered by @AzureStreaming & the Azure Machine Learning API with real-world tweet streaming; there are some cool custom visuals available in PowerBI. I've used visuals like the 'wordcloud' chart, which depicts some of the highest-scoring positive-sentiment tweets with keywords like 'happynewyear2016', 'MerryChristmas' & 'HappyHolidays'.



In the donut chart, the top 10 tweets with the most positive sentiment are portrayed, with the sentiment scores coming from the AML predictive model experiment integrated with the ASA job.


~Wish you HappyHolidays 2016!

What’s new in Azure Data Catalog

The Azure Data Catalog (previously the PowerBI Data Catalog) was released in public preview last Monday (July 13th) at @WPC15; it reveals a new world of storing & connecting #Data across on-prem & Azure SQL databases. Let's hop into a quick jumpstart.

Connect to Azure Data Catalog at https://www.azuredatacatalog.com/, making sure you log in with your organizational ID & a valid Azure subscription. Currently, it's free for the first 50 users & up to 5,000 registered data assets; the Standard edition supports up to 100 users & up to 1M registered data assets.



Let's start by signing in with the organizational ID on the portal.


Once it's provisioned, you will be redirected to this page to launch the Azure Data Catalog Windows app.



The app downloads from a ClickOnce deployment server.



After the download, it prompts you to select a server; at this point it can pull data from SQL Server Analysis Services, Reporting Services, on-prem/Azure SQL Database & Oracle DB.


For this demo, we used on-prem SQL server database to connect to Azure Data Catalog.


We selected the 'AdventureWorksLT' database here & pushed a total of 8 tables, including 'Customer', 'Product', 'ProductCategory', 'ProductDescription', 'ProductModel' & 'SalesOrderDetail'. Also, you can add tags to identify the datasets on the Data Catalog portal.


Next, click 'REGISTER' to register the dataset; optionally, you can include a preview of the data definition as well.



Once the object registration is done, you can view it on the portal. Click 'View Portal' to check the data catalogs.


Once you click, you are redirected to the Data Catalog homepage, where you can search for your data by object metadata name.




In the Data Catalog object portal, all of the registered metadata & objects are visible with property tags.


You can also open the registered object datasets in excel to start importing into PowerBI.


Click 'Excel' or 'Excel (Top 1000)' to start importing the data into Excel. The resulting data connection is in .odc format.



Once you open it in Excel, you are prompted to enable the custom extension. Click 'Enable'.


From Excel, the dataset is imported into the latest Microsoft PowerBI Designer Preview app to build a custom dashboard.


Log in to https://app.powerbi.com & click 'File' to get data from a .pbix file.


Import the .pbix file of the 'AdventureWorks' customer details & product analytics into PowerBI reports & build up a dashboard.

The PowerBI preview portal dashboard has some updates on tile details filter like extension of custom links.



The PowerBI app for Android is now available; it's useful for a quick glance at real-time analytics dashboards, especially those connected with Stream Analytics & updating in real time.







Pushing Realtime Sensor Data into ASA & Visualizing in a Near Real-Time (NRT) PowerBI Dashboard – a Frontier of IoT

As in the last demo on IoT foundations, we've seen how to leverage real-time data insights from social media datasets like Twitter with some keywords. In this demo, we push realtime sensor data from a Windows Phone device to Azure Stream Analytics (through Service Bus Event Hub channels); after processing in the ASA hub, we publish out to a realtime PowerBI dashboard, or to near real-time (NRT) analytics in Power View for Excel by pushing ASA events into an Azure SQL database via Excel Power Query.

An overview of the n-tier architecture of ASA on the IoT foundation looks like this:



IoT enables customers to connect their own devices to the Azure cloud platform & bring out real business value, whether they produce #BigData or #SmallData.

Another important topic is getting insights from weblogs or telemetry data, which can bring out good sentiment & clickstream analytics value with machine learning.

Here goes a good high level discussion from IoT team.

Coming back to the demo: first, we implemented a sample app generating accelerometer 3D events (X, Y, Z) on Windows Phone & Windows Store devices (a Universal app) & pushing the generated events as block blobs to Azure Storage, from where they feed the Service Bus Event Hub.

Attached sample code snippet.

private async void ReadingChanged(object sender, AccelerometerReadingChangedEventArgs e)
{
    await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, async () =>
    {
        AccelerometerReading reading = e.Reading;
        ScenarioOutput_X.Text = String.Format("{0,5:0.00}", reading.AccelerationX);
        ScenarioOutput_Y.Text = String.Format("{0,5:0.00}", reading.AccelerationY);
        ScenarioOutput_Z.Text = String.Format("{0,5:0.00}", reading.AccelerationZ);

        // Append this reading as a CSV row
        dataDetails = i + "," + reading.AccelerationX + "," + reading.AccelerationY + "," + reading.AccelerationZ;
        NewDataFile += Environment.NewLine + dataDetails;

        // The account key was omitted in the original snippet -- substitute your own storage credentials
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=yourazurestorageaccountname;AccountKey=<yourstorageaccountkey>");

        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("accelerometer");
        await container.CreateIfNotExistsAsync();

        // Upload the accumulated readings (with a header row) as a block blob
        CloudBlockBlob blockBlob = container.GetBlockBlobReference(newFileName);
        await blockBlob.UploadTextAsync(Headers + Environment.NewLine + NewDataFile);
    });
}

You can download the whole visual studio solution on Github.


The next challenge, as usual, is to send the real sensor events to the Event Hub with the correct credentials & publish millions of events to the Event Hub at a time.

Here goes sample code snippet.

class Program
{
    static string eventHubName = "youreventhubname";
    static string connectionString = GetServiceBusConnectionString();

    static void Main(string[] args)
    {
        Console.WriteLine("Press Ctrl-C to stop the sender process");
        Console.WriteLine("Press Enter to start now");
        Console.ReadLine();

        // Read every CSV file of sensor data from the source directory
        string[] filePaths = Directory.GetFiles(@"Your CSV Sensor Data file directory", "*.csv");
        foreach (string csv_file_path in filePaths)
        {
            DataTable csvData = GetDataTableFromCSVFile(csv_file_path);
            Console.WriteLine("Rows count: " + csvData.Rows.Count);

            var eventHubClient = EventHubClient.CreateFromConnectionString(connectionString, eventHubName);
            try
            {
                // Serialize each row as JSON and send it to the Event Hub
                foreach (DataRow row in csvData.Rows)
                {
                    var info = new Accelerometer
                    {
                        ID = row.ItemArray[0].ToString(),
                        Coordinate_X = row.ItemArray[1].ToString(),
                        Coordinate_Y = row.ItemArray[2].ToString(),
                        Coordinate_Z = row.ItemArray[3].ToString()
                    };
                    var serializedString = JsonConvert.SerializeObject(info);
                    Console.WriteLine("{0}> Sending event: {1}", DateTime.Now.ToString(), serializedString);
                    eventHubClient.SendAsync(new EventData(Encoding.UTF8.GetBytes(serializedString))).Wait();
                }
            }
            catch (Exception ex)
            {
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine("{0} > Exception: {1}", DateTime.Now.ToString(), ex.Message);
            }
        }
    }


    public static void install()
    {
        string url = @"https://…………blob.core.windows.net/accelerometer/AccelerometerSensorData.csv";
        WebClient wc = new WebClient();
        wc.DownloadFileCompleted += new AsyncCompletedEventHandler(Completed);
        wc.DownloadProgressChanged += new DownloadProgressChangedEventHandler(ProgressChanged);
        try
        {
            ConsoleHelper.ProgressTitle = "Downloading";
            ConsoleHelper.ProgressTotal = 10;
            for (int i = 0; i <= 10; i++)
            {
                ConsoleHelper.ProgressValue = i;
                if (i >= 5) ConsoleHelper.ProgressHasWarning = true;
                if (i >= 8) ConsoleHelper.ProgressHasError = true;
            }
            ConsoleHelper.ProgressTotal = 0;
            wc.DownloadFile(new Uri(url), @"\ASA\Sensors\Accelerometer\AccelerometerSensorData.csv");
        }
        catch (Exception ex)
        {
            while (ex != null)
                ex = ex.InnerException;
        }
    }

    public static void Completed(object sender, AsyncCompletedEventArgs e)
    {
        Console.WriteLine("Download Completed!");
    }

    public static void ProgressChanged(object sender, DownloadProgressChangedEventArgs e)
    {
        Console.WriteLine("Downloaded {0} of {1} bytes, {2}% complete....",
            e.BytesReceived, e.TotalBytesToReceive, e.ProgressPercentage);
        DrawProgressBar(0, 100, Console.WindowWidth, '1');
    }

    private static void DrawProgressBar(int complete, int maxVal, int barSize, char ProgressCharacter)
    {
        Console.CursorVisible = false;
        int left = Console.CursorLeft;
        decimal perc = (decimal)complete / (decimal)maxVal;
        int chars = (int)Math.Floor(perc / ((decimal)1 / (decimal)barSize));
        string p1 = String.Empty, p2 = String.Empty;

        for (int i = 0; i < chars; i++) p1 += ProgressCharacter;
        for (int i = 0; i < barSize - chars; i++) p2 += ProgressCharacter;

        // Completed portion in green, remainder in dark green
        Console.ForegroundColor = ConsoleColor.Green;
        Console.Write(p1);
        Console.ForegroundColor = ConsoleColor.DarkGreen;
        Console.Write(p2);

        Console.Write("{0}%", (perc * 100).ToString("N2"));
        Console.CursorLeft = left;
    }
    private static DataTable GetDataTableFromCSVFile(string csv_file_path)
    {
        DataTable csvData = new DataTable();
        try
        {
            using (TextFieldParser csvReader = new TextFieldParser(csv_file_path))
            {
                csvReader.SetDelimiters(new string[] { "," });
                csvReader.HasFieldsEnclosedInQuotes = true;

                // Read column names from the header row
                string[] colFields = csvReader.ReadFields();
                foreach (string column in colFields)
                {
                    DataColumn datecolumn = new DataColumn(column);
                    datecolumn.AllowDBNull = true;
                    csvData.Columns.Add(datecolumn);
                }

                while (!csvReader.EndOfData)
                {
                    string[] fieldData = csvReader.ReadFields();
                    // Treat empty values as null
                    for (int i = 0; i < fieldData.Length; i++)
                    {
                        if (fieldData[i] == "")
                            fieldData[i] = null;
                    }
                    csvData.Rows.Add(fieldData);
                }
            }
        }
        catch (Exception ex)
        {
        }
        return csvData;
    }
}


Now, build out the ASA SQL query with a specific window interval; in this demo we used 'SlidingWindow(second, n)', which computes over the Event Hub data within the specified time window.
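A minimal sketch of such a windowed query (assuming the input alias input carries the accelerometer columns ID, Coordinate_X, Coordinate_Y & Coordinate_Z sent above, and an output alias named output):

```sql
-- Average each accelerometer axis over a 10-second sliding window
SELECT ID,
       AVG(CAST(Coordinate_X AS float)) AS AvgX,
       AVG(CAST(Coordinate_Y AS float)) AS AvgY,
       AVG(CAST(Coordinate_Z AS float)) AS AvgZ
INTO output
FROM input
GROUP BY ID, SlidingWindow(second, 10)
```

The CAST is needed because the sender serialized the coordinates as strings.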



Next, set up the processed-output visualization on the PowerBI preview portal by selecting the 'Output' tab of the ASA job. Once you provide the output dataset name & start the ASA job, you will see the dataset created on the PowerBI portal with a small yellow star icon beside it.


A step-by-step demonstration video is available on my YouTube channel.

Microsoft IoT Foundation: Realtime Tweets Streaming into Azure Stream Analytics with PowerBI & PowerBI Designer Preview

Azure Stream Analytics (ASA) is one of the major components of the Microsoft #IoT foundation; just a month ago it gained 'PowerBI' as an output connector (in public preview) for visualizing realtime data streaming from the Event Hub into the Stream Analytics hub.

In this demo, we focus on end-to-end realtime tweet analytics: tweets are collected through Java code using the 'Twitter4j' library, then stored in OneDrive as a .csv file as well as in Azure Storage as block blobs. The realtime tweets are then streamed into a Service Bus Event Hub for processing; after creating the Stream Analytics job, make sure the input connector is properly set to a data stream from the 'event hub', then process the ASA SQL query with a specific window such as 'HoppingWindow(minute, 10, 5)' or 'SlidingWindow(second, 3)' for overlapping/non-overlapping window frames over the data stream.

Finally, select PowerBI as the output connector & authorize with your organizational account. Once your ASA job starts running, you will see the PowerBI dataset with the output dataset name you selected; start building the ASA-connected PowerBI report & dashboard.
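The hopping (overlapping) window variant can be sketched like this; the Keyword column & the input/output aliases are assumptions for illustration, not names from the original job:

```sql
-- Count tweets per keyword over 10-minute windows that hop every 5 minutes,
-- so consecutive windows overlap by 5 minutes
SELECT Keyword, COUNT(*) AS TweetCount
INTO output
FROM input
GROUP BY Keyword, HoppingWindow(minute, 10, 5)
```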

First, a good amount of real tweets are collected based on the specific keywords like #IoT, #BigData, #Analytics, #Windows10, #Azure, #ASA, #HDI, #PowerBI, #AML, #ADF etc.

The sample tweets look like this:

06/24/2015 07:25:19,CodeNotFound,France,613714525431418880
06/24/2015 07:25:19,sinequa,Paris – NY- London – Frankfurt,613714525385289728
06/24/2015 07:25:20,RavenBayService,Calgary, Alberta,613714527302098944
06/24/2015 07:25:20,eleanorstenner,,613714530112274432
06/24/2015 07:25:21,ISDI_edu,,613714530758230016
06/24/2015 07:25:23,muthamiphilo,Kenya,613714541562740736
06/24/2015 07:25:23,tombee74,ÜT: 48.88773,2.23806,613714541931851776
06/24/2015 07:25:25,EricLibow,,613714547975790592

Now, the data is sent to the Event Hub for realtime processing & we've written the ASA SQL like this:

CREATE TABLE input(
    DateTime nvarchar(MAX),
    TwitterUserName nvarchar(MAX),
    ProfileLocation nvarchar(MAX),
    MorePreciseLocation nvarchar(MAX),
    Country nvarchar(MAX),
    TweetID nvarchar(MAX))

SELECT input.DateTime, input.TwitterUserName, input.ProfileLocation,
       input.MorePreciseLocation, input.Country, COUNT(input.TweetID) AS TweetCount
INTO output
FROM input
GROUP BY input.DateTime, input.TwitterUserName, input.ProfileLocation,
         input.MorePreciseLocation, input.Country, SlidingWindow(second, 10)



Next, start building the PowerBI report on the PowerBI preview portal. Once you build the dashboard by pinning graphs from the report, it looks something like this.



You can visualize realtime updates of the data, such as the total tweet count for the specific keywords, the total usernames tweeting, the total tweet locations, etc.


In another demo, we've used the PowerBI Designer Preview tool, collecting the processed tweets coming out of the ASA hub into 'Azure Blob Storage' & then picking them up in 'PowerBI Designer Preview'.


In the latest PBI, we've got support for combo stacked charts, which we've used to depict the average tweet count of those specific keywords by location & time frame over intervals of a few minutes & seconds.


Also, you get support for the Power Q&A feature, as in 'PowerBI for Office 365', with natural language processing (NLP) backed by Azure Machine Learning.

For example, if I ask a question of this real-world streaming dataset in Power Q&A:

show tweetcount where profilelocation is bayarea & London, Auckland, India, Bangalore,Paris as stacked column chart


After that, save the PBI Designer file as .pbix & upload it to www.powerbi.com under Get Data -> Local File. The portal supports uploading a PBI Designer file as a data source connector as well.


Once uploaded, build out the dashboard; it supports scheduled refresh on the preview portal itself. Right-click your PBI report on the portal & select Settings to open the schedule refresh page.



Here is the realtime scheduled-refresh dashboard of Twitter IoT analytics on realtime tweets.


The same PBI dashboards can be viewed from the PowerBI app for the Windows Store or iOS. Here goes a demonstration.



Worldwide EarthQuake Data Analysis (Nepal EarthQuake)- Microsoft PowerBI

Last weekend, we were all horrified by the terrible earthquake that struck Nepal, India & greater Asia, with a death toll that continues to climb. There are several parameters to consider, like the depth (in km) of the earthquake, the magnitude of the quake, the severity of deaths, the number of people who died in the quake & the number who died from the shaking effect.

In this PowerBI demo, we use data on the world's dreadful earthquake incidents since 1900.



On the Excel Power View dashboard, here is the latest death-toll analysis report of the Nepal earthquake 2015.




In the latest PowerBI Designer Preview, worldwide country-wise earthquake magnitude data analysis is represented, sorted by total deaths & depth (in km) of quake intensity.





