PowerShell Script for Assigning Public IP to Azure Virtual Machine

Azure virtual machines provisioned with the 'Resource Manager/Classic' deployment model are assigned a virtual IP & a dynamic internal IP, but no public IP address by default. A public IP address is necessary for accessing services deployed on the VM from a local browser.

For example, if you provision Apache Hadoop/Hortonworks Data Platform/Cloudera Hadoop Distribution on an Azure Linux VM, you may need to access the Hadoop services from a local browser at addresses like the HDFS service at http://<ip address of VM>:50070 and http://<ip address of VM>:50075, the MapReduce service at http://<ip address of VM>:8080, etc.

In order to assign a public IP to an Azure Linux VM, you need to execute the following PowerShell script.

Get-AzureVM -ServiceName 'azfilestorage' -Name 'azfilestorage' | Set-AzurePublicIP -PublicIPName 'linuxvmip' | Update-AzureVM

Next, update the VM with an endpoint for SSH port 22 so that the VM can be accessed through SSH.

Get-AzureVM -Name 'azfilestorage' -ServiceName 'azfilestorage' | Set-AzureEndpoint -Protocol tcp -Name 'SSH' -PublicPort 22 -LocalPort 22 | Update-AzureVM
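
To verify the assignment (a quick check, assuming the same service & VM names as above), you can inspect the VM object & its endpoints:

# The VM object returned by Get-AzureVM should now show the assigned public IP
$vm = Get-AzureVM -ServiceName 'azfilestorage' -Name 'azfilestorage'
$vm.PublicIPAddress

# List the endpoints, including the SSH endpoint added above
$vm | Get-AzureEndpoint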



Next, you can check the public IP address on the Azure Portal under the 'IP Address' section of the Azure virtual machine, as in the screenshot.


R with PowerBI – A step by step guide

There is a lot of interest everywhere in how to integrate R scripts with Microsoft PowerBI dashboards. Here goes a step by step guide.

Let's assume you have a couple of ready-made R scripts available, for example using the ggplot2 library. The following scripts perform analytics on the CHOL (cholesterol) dataset.

  1. Open RStudio or the R console (CRAN distribution) & install the ggplot2 package first.

  2. Paste the following R script & execute it.

library(ggplot2)

chol <- read.table(url("http://assets.datacamp.com/blog_assets/chol.txt"), header = TRUE)
# Take the column "AGE" from the "chol" dataset and make a histogram of it
qplot(chol$AGE, geom = "histogram")
ggplot(data = chol, aes(chol$AGE)) + geom_histogram()

You should be able to see output visuals like this.


3. Next, execute the following R code, which sets the binwidth argument of the 'qplot()' function.

qplot(chol$AGE,
      geom = "histogram",
      binwidth = 0.5)


4. Let's take help of the hist()-style title & label arguments within qplot().

# Let's take help from hist()-style arguments (main, xlab)
qplot(chol$AGE,
      geom = "histogram",
      binwidth = 0.5,
      main = "Histogram for Age",
      xlab = "Age")


5. Now, add the col argument with the I() function, where the color is nested.

# Add the col argument, using I() with the nested color
qplot(chol$AGE,
      geom = "histogram",
      binwidth = 0.5,
      main = "Histogram for Age",
      xlab = "Age",
      col = I("red"))


6. Next, adjust the ggplot2 output a little with the following code.

# Adjusting ggplot
ggplot(data = chol, aes(chol$AGE)) +
  geom_histogram(breaks = seq(20, 50, by = 2),
                 alpha = .2) +
  labs(title = "Histogram for Age") +
  labs(x = "Age", y = "Count") +
  xlim(c(18, 52)) +
  ylim(c(0, 30))


7. Plot a bar graph with the following code.

# Plotting a bar graph of the MORT column
qplot(chol$MORT,
      geom = "bar",
      main = "Bar Graph for Mort",
      xlab = "Mort")


8. Next, open the PowerBI Desktop tool. You can download it free from this link. Now, click on the Get Data tab to start exploring & connect with the R dataset.

If you already have R installed on the same system where you are building the PowerBI visuals, you just need to paste the R scripts into the script editor; otherwise, you need to install R on the system where you are using PowerBI Desktop, like this.


9. Next, you can also choose the 'custom R visual' in the PowerBI Desktop visualizations pane, provide the required R scripts to build the visuals & finally click 'Run'.



10. Build all the R visuals by following the same steps & finally save the dashboard.


11. You can refresh an R script in Power BI Desktop; when you do, Power BI Desktop runs the R script again in its environment.




Quick Installation of Single node Datazen Server in Azure Cloud Service & Sample Dashboards

This demo provides step by step guidance on a quick setup of a Datazen server on a single node & connecting the Publisher app to build custom visuals.

Pre-requisites for the demo:

  1. An active Azure subscription.
  2. Windows 10 Store (for installation of Datazen Publisher).

Detailed steps are depicted as follows:

1.     Go to https://manage.windowsazure.com/

2.     Login with your Live ID.

3.     Click +New.

4.     Select Compute -> Virtual Machine -> From Gallery.
(Fig. 2)

5.     Select the Windows Server 2012 R2 Datacenter image.

6.     Click the next arrow at the bottom right.



7.     Enter the required information for Virtual machine configuration.

o   Virtual machine name

o   Choose Basic or Standard Tier (recommended)

o   Choose A4 as the machine size.  A4 has 8 cores, which is the minimum number of cores for a single machine setup.

o   Enter the machine admin username/password.

o   Click the next arrow.


8.     Enter the remaining information for the virtual machine configuration.

o   Select option Create a new cloud service.

Make sure the Cloud Service DNS Name is available.

o   Choose the subscription it should be billed to & the region where it should be deployed.

Choose the one closest to your location

o   Leave the Storage Account and Availability Set as is.

o   Add an HTTP endpoint at a minimum.

You may need to scroll to add the endpoint.

o   Click the next arrow.

9.     Select Install VM Agent and leave the others unchecked.

10.  Click the checkmark to start the deployment process.

You’ll see it start the provisioning process in the list of the virtual machines you are responsible for in Azure.


11.   Wait for the status to change from Starting to Running in virtual machines.  
12.   Select your VM then click Connect.  
13.   Save the RDP file to your local machine.  
14.   Open the saved Remote Desktop Connection file, then click Connect.  
15.   Connect to the VM via Remote Desktop and enter the admin username/password.
16.   Click Yes to connect to the server.
17.   Click Configure this local server in the Server Manager dashboard that appears when you login.  
18.  Then change the IE Enhanced Security Configuration to Off for Administrators.

You can always change it back if you really want to when you’re done.

19.  Close the Server Manager.

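If you'd rather script the provisioning from steps 1-10 than click through the portal, here is a rough classic-deployment PowerShell sketch; the VM name, cloud service name, credentials & location are illustrative, and A4 corresponds to the classic 'ExtraLarge' size:

# Pick the latest Windows Server 2012 R2 Datacenter gallery image
$image = Get-AzureVMImage |
    Where-Object { $_.Label -like 'Windows Server 2012 R2 Datacenter*' } |
    Sort-Object PublishedDate -Descending | Select-Object -First 1

# Create the VM in a new cloud service, with an admin account & an HTTP endpoint
New-AzureVMConfig -Name 'datazensrv' -InstanceSize 'ExtraLarge' -ImageName $image.ImageName |
    Add-AzureProvisioningConfig -Windows -AdminUsername 'dzadmin' -Password '<admin password>' |
    Add-AzureEndpoint -Name 'HTTP' -Protocol tcp -LocalPort 80 -PublicPort 80 |
    New-AzureVM -ServiceName 'mycloudservicename' -Location 'East US'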



Section 2: Install the Datazen Server

1.     Navigate to the following link and download the Datazen server software onto the VM. You may need to turn off IE Enhanced Security on the server to do so.

2.     The Datazen server files download as a zipped file. Extract all the files.

3.     Open Datazen Enterprise Server.3.0.2562.

4.     Click Run to start the install process.

5.     In the Datazen Enterprise Server Setup, click Next.
6.     Click Next in the Setup wizard, accepting the terms in the License Agreement and moving through each screen.


7.     Click Next on the Features page


8.     Click Next on the Core Service Credentials page.  
9.     Once you get to the Admin Password page, type a Password for the Datazen admin user.  (Fig. 20)

This doesn’t have to be the same password as you used for the server.

10.  Click Next. 

11.  On the Authentication page, leave the Authentication Mode as Default.

12.  Click Next.

13.  On the Repository Encryption page, select Copy to Clipboard, then paste the key into a Notepad file.

14.  Save the Notepad file to a safe location.


15.  Click Next. 

16.  On the Instance Id page, select Copy to Clipboard then paste the Id into a Notepad file.

17.  Save the Notepad file to a safe location.


18.  Click Next. 

19.  On the Data Acquisition Service Credentials page, leave the credentials as is, then click Next.  
20.  On the Web Applications IIS Settings page leave the default settings, then click Next.   
21.  On the Control Panel Email Settings page, leave the default values since this is a test server.

22.  Click Next. 

23.  On the Ready to Install page, click Install and wait until the installation is complete.
This might take a few minutes.

Section 3: Configure the Datazen Server

1.   Open your browser on your local machine.

2.   Navigate to http://mycloudservicename.cloudapp.net/cp.

Make sure you replace mycloudservicename with the name of your cloud service.

3.   If you can successfully connect, you should see the Control Panel Log In screen.

4.   Enter the username admin and the password you entered in the Setup wizard, then select  Log In.

5.   You will need to create a new user to start creating dashboard hubs, since you need each hub to have an owner.  The owner can NOT be the admin user.  Click Create User to create your first user.




6.   Enter a value in the top three fields (the email address can be fake if you want) and select Create User.  
7.   You will now see a new option to Create BI Hub.
8.   Enter whatever hub name you'd like, but make sure you enter the username of the user you just created as the owner username.

9.   Enter a maximum number of users that can connect to the hub.

10.  Click Create. 

BI Hub Created.JPG
11.   Finish the creation of the hub. It will be displayed in the list of available hubs.  
12.  The new hub will also be shown in the navigation menu at the bottom left of the screen.  
13.  Click the Server Users link on the left-hand side of the screen.


Server Users.png
14.  Click Create User.   
15.  Fill in the fields under Required Info.

16.  Click Create User.

17.  You will see the user and a Set password link option next to the username.

18.  Click on the Set password link, then copy the link to your clipboard.
Note: This step is only required because email notification is not set up.

19.  Log out as the admin.

20.  Open a new browser window and paste the URL to reset the password into the address bar.

You can now finish setting up that user by entering the password for the account.

21.  In the Control Panel Activate User Account screen, enter the new password, then re-type the password.

22.  Click Activate My Account. (Fig. 40)

23.  Log out as this user and log back in as the admin before proceeding.



Section 4: Apply a Custom Branding

1.  To add the Wide World Importers brand package to the server, save it locally. The package is provided with this demo.

2.  Click on the Branding link on the left-hand side and upload the brand package to the server.

3.  Make sure you choose the Server to upload it to.

You will see the Server icon has the Wide World Importers branding associated.


4.  To make sure it was applied properly, open a new browser and navigate to the following URL (make sure you replace mycloudservicename with whatever you named yours).


Your Server Login screen should look as shown on the right, now having the Wide World Importers brand package applied. (Fig. 43)



Section 5: Connect to the Datazen Server with Publisher

1.  Open the Datazen Publisher app.

If this is the first time using the app, you will have the option of connecting to the Datazen demo server. We recommend doing that, so you will have some nice demo dashboards to show immediately.

2.  To add a new server, right-click in the dashboard, then click Connected. (Fig. 44)

3.  Click Add New Server Connection. (Fig. 45)
4.  Provide the following information to connect to a Datazen server. (Fig. 46)

Server Address: mycloudservicename.cloudapp.net
User name: the username of the user you created earlier
Password: that user's password

5.  Uncheck Use Secure Connection.

6.  Click Connect. (Fig. 46)

7.  When connected, you should be able to publish dashboards to your Datazen server. (Fig. 47)

8.  You will see a nice dashboard with KPIs for Wide World Importers and Fabrikam Insurance. (Fig. 47)







Resolution of Error: “This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is ..\packages\Microsoft.CodeDom.Providers.DotNetCompilerPlatform.1.0.0\build\Microsoft.CodeDom.Providers.DotNetCompilerPlatform.props.”

Today I faced this issue while compiling an ASP.NET 4.5.2 WebForms app in Visual Studio 2015 Enterprise; the app had been built a few months back in a Visual Studio 2013 Update 4 environment from Visual Studio Online (TFS).

Problem Statement:

The error appeared while building the project.

“This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them.  For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is ..\packages\Microsoft.CodeDom.Providers.DotNetCompilerPlatform.1.0.0\build\Microsoft.CodeDom.Providers.DotNetCompilerPlatform.props.”


The error occurs due to obsolete NuGet packages & a wrong path indicated in the app's .csproj or .vbproj file. So, in order to solve the issue:

  1. First, clear all folders under the 'packages' directory in the project's main directory.

2. Reopen the solution in VS & click 'Manage NuGet Packages' in the project's Solution Explorer. Update all NuGet packages to the latest version.



3. Open the project .csproj/.vbproj file in any text editor & replace the import line with the appropriate line for the 'Microsoft.Net.Compilers.1.1.1' package available in your project directory, as sketched below.
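
For example, a sketch of what the corrected import typically looks like (the version folder must match what is actually under your packages directory):

<Import Project="..\packages\Microsoft.Net.Compilers.1.1.1\build\Microsoft.Net.Compilers.props" Condition="Exists('..\packages\Microsoft.Net.Compilers.1.1.1\build\Microsoft.Net.Compilers.props')" />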


4. Don't forget to comment out the following block in the project .csproj/.vbproj file, as in the screenshot.

<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
  <PropertyGroup>
    <ErrorText>This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
  </PropertyGroup>
  <Error Condition="!Exists('..\packages\Microsoft.CodeDom.Providers.DotNetCompilerPlatform.1.0.0\build\Microsoft.CodeDom.Providers.DotNetCompilerPlatform.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\Microsoft.CodeDom.Providers.DotNetCompilerPlatform.1.0.0\build\Microsoft.CodeDom.Providers.DotNetCompilerPlatform.props'))" />
  <Error Condition="!Exists('..\packages\Microsoft.Net.Compilers.1.0.0\build\Microsoft.Net.Compilers.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\Microsoft.Net.Compilers.1.0.0\build\Microsoft.Net.Compilers.props'))" />
</Target>


~Happy Troubleshooting!!

Azure Stream Analytics & Machine Learning Integration With RealTime Twitter Sentiment Analytics Dashboard on PowerBI

Recently, the integration of ASA (Azure Stream Analytics) & AML (Azure Machine Learning) was introduced as a preview update; it's now possible to add an AML web service URL & API key as a 'custom function' with an ASA input. In this demo, realtime tweets are collected based on keywords like '#HappyHolidays2016', '#MerryChristmas' & '#HappyNewYear2016', and stored directly in a .csv file saved on OneDrive. Here goes the solution architecture diagram of the POC.




Now, add the Service Bus Event Hub endpoint as input to the ASA job. Then deploy the 'Twitter Predictive Sentiment Analytics Model' by clicking 'Open in Studio'. Don't forget to run the experiment before deploying.



Once the model is deployed, open the 'Web Service' dashboard page to get the model URL & API key; click on the default endpoint -> download the Excel 2010 or earlier workbook. Collect the URL & API key to apply to the ASA function credentials for the AML deployment.


Next, create an ASA job & add the credentials of the event hub where the real-world tweets are getting pushed, then click on the 'Functions' tab of the ASA job to add the AML credentials. Provide the model name, URL & API key of the model & once it's added, click Save.



Now, add ASA SQL like the sketch below to aggregate the realtime tweet sentiment scores coming out of the predictive Twitter sentiment model.
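
A minimal sketch of such a query, assuming the AML function was registered in the 'Functions' tab as 'sentiment', the event hub input alias is 'twitterinput' & the blob output alias is 'sentimentoutput' (all illustrative names; an aggregate could be layered on with GROUP BY ... TumblingWindow):

-- Score each incoming tweet with the AML web service function
WITH ScoredTweets AS (
    SELECT text, sentiment(text) AS result
    FROM twitterinput
)
-- The scored field name (e.g. [Score]) depends on the published AML model
SELECT text, result.[Score] AS SentimentScore
INTO sentimentoutput
FROM ScoredTweets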



Provide the output as Azure Blob storage, add a container name & serialization type as CSV & start the ASA job. Also, start importing data into PowerBI desktop from the ASA output Azure blob storage account.




PowerBI Desktop contains built-in Power Query to start preparing the ASA output data & processing data types. Set the AML model sentiment score to Decimal type & TweetTexts to Text (String) type.



Start building the 'Twitter Sentiment Analytics' dashboard powered by @AzureStreaming & the Azure Machine Learning API with real-world tweet streaming; there are some cool custom visuals available in PowerBI. I've used visuals here like the 'wordcloud' chart, which depicts some of the highest-scoring positive-sentiment tweets with the most frequent keywords like 'happynewyear2016', 'MerryChristmas', 'HappyHolidays', etc.



In the donut chart, the top 10 tweets with the most positive sentiment counts are portrayed with the specific sentiment scores coming from the AML predictive model experiment integrated with the ASA job.


~Wish you HappyHolidays 2016!

A lap around Microsoft Azure IoT Hub with Azure Stream Analytics & IoT Analytics Suite

Last month at #AzureConf 2015, the Azure IoT Suite was announced as available for purchase, along with the GA release of Azure IoT Hub. IoT Hub helps control, monitor & connect thousands of devices that communicate via the cloud & talk to each other using suitable protocols. You can connect to your Azure IoT Hub using the IoT Hub SDKs, available in different languages like C, C#, Java, Ruby, etc. There are also monitoring tools available like Device Explorer or iothub-explorer. In this demo, weather data analytics is demonstrated using Azure IoT Hub with Stream Analytics, powered by the Azure IoT Suite & visualized from an Azure SQL database with PowerBI.

You can provision your own device into the Azure IoT analytics suite using the Device Explorer or iothub-explorer tool & start bi-directional device-to-cloud & cloud-to-device communication.

First, create your Azure IoT Hub from the Azure Preview Portal by selecting New -> Internet of Things -> Azure IoT Hub. Provide a hub name & select the pricing & scale tier [F1 – free (1/subscription, connect 10 devices, 3,000 messages/day), S1 – standard (50,000 messages/day), S2 – standard (1.5 M messages/day)] for device-to-cloud communication. Select IoT Hub units, device-to-cloud partitions, resource group, subscription & finally the location of deployment (currently available only in three locations: 'East Asia', 'East US', 'North Europe').




Once the hub is created, switch to Device Explorer to create a device; for details about creating & registering a device, refer to this GitHub page. After registering the device, move back to the 'Data' tab of the Device Explorer tool & click the 'Monitor' button to start receiving the device-to-cloud events sent to Azure IoT Hub from the device.
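
If you prefer registering the device in code rather than with the Device Explorer GUI, here is a minimal sketch using the Microsoft.Azure.Devices service SDK (the connection string placeholder & the device id 'myWeatherDevice' are illustrative):

using System;
using Microsoft.Azure.Devices;

class RegisterDevice
{
    static void Main()
    {
        // Use the service-side ("iothubowner") connection string, not a device connection string
        var registryManager = RegistryManager.CreateFromConnectionString("<iothub owner connection-string>");

        // Create the device identity; IoT Hub generates its symmetric keys
        var device = registryManager.AddDeviceAsync(new Device("myWeatherDevice")).Result;
        Console.WriteLine("Device key: {0}", device.Authentication.SymmetricKey.PrimaryKey);
    }
}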



The schema of the weather dataset looks like the following; fresh data is collected from various sensors & fed into Azure IoT Hub, where it can be viewed using the Device Explorer tool.



In order to push data from the weather sensor device to Azure IoT Hub, the following code snippet is used. The full code snippet is available on my GitHub page.


using System;
using System.Text;
using System.Threading.Tasks;
using System.IO;
using System.Data;
using Newtonsoft.Json;
using Microsoft.VisualBasic.FileIO;

namespace Microsoft.Azure.Devices.Client.Samples
{
    // Simple POCO matching the weather sensor CSV schema
    class WeatherData
    {
        public string weatherDate, weatherTime, apperantTemperature, cloudCover,
            dewPoint, humidity, icon, pressure, temperature, timeInterval,
            visibility, windBearing, windSpeed, latitude, longitude;
    }

    class Program
    {
        private const string DeviceConnectionString = "Your device connection-string";

        static void Main(string[] args)
        {
            try
            {
                DeviceClient deviceClient = DeviceClient.CreateFromConnectionString(DeviceConnectionString);

                if (deviceClient == null)
                {
                    Console.WriteLine("Failed to create DeviceClient!");
                }
                else
                {
                    Console.WriteLine("Press Ctrl-C to stop the sender process");
                    Console.WriteLine("Press Enter to start now");
                    Console.ReadLine();

                    // ReceiveCommands(deviceClient) could be run in parallel to listen
                    // for cloud-to-device messages
                    SendEvent(deviceClient).Wait();
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine("Error in sample: {0}", ex.Message);
            }
        }

        static async Task SendEvent(DeviceClient deviceClient)
        {
            // Pick up the last sensor .csv file dropped into the \Weblog\ folder
            string[] filePath = Directory.GetFiles(@"\Weblog\", "*.csv");
            string csv_file_path = string.Empty;
            for (int i = 0; i < filePath.Length; i++)
            {
                csv_file_path = filePath[i];
            }

            DataTable table = GetDataTableFromCSVFile(csv_file_path);
            Console.WriteLine("Rows count: " + table.Rows.Count);
            try
            {
                // Serialize each row as JSON & send it as a device-to-cloud event
                foreach (DataRow rows in table.Rows)
                {
                    var info = new WeatherData
                    {
                        weatherDate = rows.ItemArray[0].ToString(),
                        weatherTime = rows.ItemArray[1].ToString(),
                        apperantTemperature = rows.ItemArray[2].ToString(),
                        cloudCover = rows.ItemArray[3].ToString(),
                        dewPoint = rows.ItemArray[4].ToString(),
                        humidity = rows.ItemArray[5].ToString(),
                        icon = rows.ItemArray[6].ToString(),
                        pressure = rows.ItemArray[7].ToString(),
                        temperature = rows.ItemArray[8].ToString(),
                        timeInterval = rows.ItemArray[9].ToString(),
                        visibility = rows.ItemArray[10].ToString(),
                        windBearing = rows.ItemArray[11].ToString(),
                        windSpeed = rows.ItemArray[12].ToString(),
                        latitude = rows.ItemArray[13].ToString(),
                        longitude = rows.ItemArray[14].ToString()
                    };

                    var serializedString = JsonConvert.SerializeObject(info);
                    Console.WriteLine("{0}> Sending event: {1}", DateTime.Now.ToString(), serializedString);
                    await deviceClient.SendEventAsync(new Message(Encoding.UTF8.GetBytes(serializedString)));
                }
            }
            catch (Exception ex)
            {
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine("{0} > Exception: {1}", DateTime.Now.ToString(), ex.Message);
            }
        }

        private static DataTable GetDataTableFromCSVFile(string csv_file_path)
        {
            DataTable csvData = new DataTable();
            try
            {
                using (TextFieldParser csvReader = new TextFieldParser(csv_file_path))
                {
                    csvReader.SetDelimiters(new string[] { "," });
                    csvReader.HasFieldsEnclosedInQuotes = true;

                    // Read column names from the header row
                    string[] colFields = csvReader.ReadFields();
                    foreach (string column in colFields)
                    {
                        DataColumn datecolumn = new DataColumn(column);
                        datecolumn.AllowDBNull = true;
                        csvData.Columns.Add(datecolumn);
                    }

                    // Read data rows, converting empty fields to null
                    while (!csvReader.EndOfData)
                    {
                        string[] fieldData = csvReader.ReadFields();
                        for (int i = 0; i < fieldData.Length; i++)
                        {
                            if (fieldData[i] == "")
                            {
                                fieldData[i] = null;
                            }
                        }
                        csvData.Rows.Add(fieldData);
                    }
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine("Exception: " + ex.Message);
            }
            return csvData;
        }

        // Listens for cloud-to-device commands sent through IoT Hub
        static async Task ReceiveCommands(DeviceClient deviceClient)
        {
            Console.WriteLine("\nDevice waiting for commands from IoTHub...\n");

            while (true)
            {
                Message receivedMessage = await deviceClient.ReceiveAsync(TimeSpan.FromSeconds(1));

                if (receivedMessage != null)
                {
                    string messageData = Encoding.ASCII.GetString(receivedMessage.GetBytes());
                    Console.WriteLine("\t{0}> Received message: {1}", DateTime.Now.ToLocalTime(), messageData);
                    await deviceClient.CompleteAsync(receivedMessage);
                }
            }
        }
    }
}

You can check the output of the events being sent from device to cloud on the console.


Next, start pushing the device data into Azure IoT Hub & monitor the event-receiving process through Device Explorer. Now, start provisioning an Azure Stream Analytics job in the Azure portal. Provide 'Azure IoT Hub' as an input to the job, as follows.





Now provide the Azure Stream Analytics query that takes the incoming device-to-cloud datasets & passes them into the Azure SQL database. First, provision a SQL database on Azure & connect it as output to the Stream Analytics job.

CREATE TABLE input (
    weatherDate nvarchar(max),
    weatherTime datetime,
    apperantTemperature nvarchar(max),
    cloudCover nvarchar(max),
    dewPoint nvarchar(max),
    humidity nvarchar(max),
    icon nvarchar(max),
    pressure nvarchar(max),
    temperature nvarchar(max),
    timeInterval nvarchar(max),
    visibility nvarchar(max),
    windBearing nvarchar(max),
    windSpeed nvarchar(max),
    latitude nvarchar(max),
    longitude nvarchar(max)
);

SELECT input.weatherDate, input.weatherTime, input.apperantTemperature, input.cloudCover,
    input.dewPoint, input.humidity, input.icon, input.pressure,
    AVG(CAST(input.temperature AS float)) AS avgtemperature,
    input.timeInterval, input.visibility, input.windBearing,
    input.windSpeed, input.latitude, input.longitude
INTO weathersql
FROM input
GROUP BY input.weatherDate, input.weatherTime, input.apperantTemperature, input.cloudCover,
    input.dewPoint, input.humidity, input.icon, input.pressure, input.timeInterval,
    input.visibility, input.windBearing, input.windSpeed, input.latitude, input.longitude,
    TumblingWindow(second, 2)


Specify the output of the 'WeatherIoT' ASA job as 'Azure SQL Database'; alternatively, you can select any of the other connectors like 'Event Hub', 'DocumentDB', etc.



Make sure to create the necessary database & table on SQL first, before adding it as output to the ASA job. For this demo, I have created the 'weatheriot' table in the Azure SQL database. The T-SQL query looks like the sketch below.
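
A sketch of that table, inferred from the weather schema used in the device code above (adjust column types as needed):

-- Target table for the ASA 'weathersql' output
CREATE TABLE weatheriot (
    weatherDate nvarchar(max),
    weatherTime datetime,
    apperantTemperature nvarchar(max),
    cloudCover nvarchar(max),
    dewPoint nvarchar(max),
    humidity nvarchar(max),
    icon nvarchar(max),
    pressure nvarchar(max),
    avgtemperature float,
    timeInterval nvarchar(max),
    visibility nvarchar(max),
    windBearing nvarchar(max),
    windSpeed nvarchar(max),
    latitude nvarchar(max),
    longitude nvarchar(max)
);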



Next, start the ASA job & let the Azure IoT Hub (device-to-cloud) data flow through the IoT Hub -> ASA -> Azure SQL database pipeline. Once you receive data in your Azure SQL table, start building the PowerBI 'Weather IoT Data Analytics' dashboard for visualization & to leverage the power of the Azure IoT momentum.


Connect to PowerBI with the same account as the Azure subscription where you provisioned the ASA job & start importing data from the Azure SQL database. Create stunning reports using funnel, donut & global map charts with live data refresh.


For this demo, I've populated charts for average weather temperature, pressure, humidity & dew point forecasting analysis over specific areas based on latitude & longitude values, plotted & pinned to the PowerBI 'Weather Data Azure IoT Analytics' dashboard.



Deployment of Cloudera Enterprise 5.4.4 (CDH 5) on Microsoft Azure Virtual Machine & Running Impala Shell as a Single Node Cluster

Deployment of Cloudera Enterprise (CDH) 5.4.4 can be implemented directly on Microsoft Azure virtual machines, & we can start working with the Impala shell & Hue right away.

The hosting process is super easy; just make sure the following prerequisites & troubleshooting steps are taken care of.

Prerequisites:

  1. SELinux should be disabled (see the command sketch after this list). Before disabling SELinux you may try sysctl -w vm.swappiness=0; to keep your change permanent, add the line below to /etc/sysctl.conf:

vm.swappiness = 10

  2. Change the root password.
  3. Change the hostname in the /etc/hosts file.
  4. Open ports 7180, 7182, 9000 & 9001.
  5. Set up passwordless sudo user authentication.
  6. In the /etc/hosts file, map the hostname to the host's IP address (find it with $ifconfig).
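
As a minimal sketch (assuming RHEL 6.x & a root shell; the files & commands are the standard ones), the SELinux & swappiness changes can be applied like this:

# Disable SELinux for the current session; the config edit makes it persistent after reboot
setenforce 0
sed -i 's/^SELINUX=enforcing/SELINUX=disabled/' /etc/selinux/config

# Lower swappiness immediately & persist it across reboots
sysctl -w vm.swappiness=10
echo 'vm.swappiness = 10' >> /etc/sysctl.conf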

Issue: The Cloudera Manager site does not open in the browser after installation & the following error shows in the log:

cloudera-scm-server dead but pid file exists

Follow the steps:

# service cloudera-scm-server stop

# service cloudera-scm-server-db stop

# rm /var/run/cloudera-scm-server.pid

# service cloudera-scm-server-db start

# service cloudera-scm-server start

Details about the step-by-step process of deploying CDH 5 on an MS Azure virtual machine (RHEL 6.x) can be viewed on my YouTube channel.
