Microsoft Graph and Client IDs

Recently I have been working in PowerShell to access Microsoft Graph, primarily to interact with Intune. My starting point has been the samples provided by Microsoft here: https://github.com/microsoftgraph/powershell-intune-samples. While extending the samples to modify AAD user objects I started getting Access Denied, even with a Global Admin account. Really weird, right? Well, it comes down to the $clientid variable in the samples, which is for the “Microsoft Intune PowerShell” service principal; I needed to use the “Microsoft Azure PowerShell” service principal instead.

Sounds easy enough, right? After spending a little time scouring the internet I couldn’t find the “ClientID” for Azure Active Directory; in the end I got around it by finding the GUID on another blog (I don’t recall which one).

Which got me thinking: this information should be stored somewhere on the tenant. After much investigation I’ve found the service principals by using the AzureAD PowerShell module with the “Get-AzureADServicePrincipal” command. If your tenant is like mine you will see a flash of objects fly by (we have over 300 in ours). Luckily I have a demo tenant where I can run Get-AzureADServicePrincipal to obtain the full list of ClientIDs that are there by default; with Intune enabled there are 21 objects. For this solution we only need the two ClientIDs for Microsoft Azure PowerShell and Microsoft Intune PowerShell, which are below (for the full connection scripts from PowerShell to Microsoft Graph, have a look at the samples in the link above):

AAD ClientID = “1950a258-227b-4e31-a9cf-717495945fc2”

Intune ClientID = “d1ddf0e4-d672-4dae-b554-9d5bdfd93547”

Note that when you return the full list of service principals with PowerShell, the ClientID is named AppId.
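Here is a minimal sketch of pulling those two values out of your own tenant with the AzureAD module; it assumes the module is installed and that your account has rights to read the directory.

# Sign in, then filter the service principals down to the two we care about and show their AppId (ClientID).
Connect-AzureAD
Get-AzureADServicePrincipal -All $true |
    Where-Object { $_.DisplayName -in 'Microsoft Azure PowerShell', 'Microsoft Intune PowerShell' } |
    Select-Object DisplayName, AppId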

Depending upon how the service principals have been configured, you might not be able to use every AppId in PowerShell to invoke commands against Microsoft Graph.

Good Luck

Steve.

Send an email with GraphAPI and PowerShell

To follow on from this blog post, where we interrogated the GraphAPI to obtain information about the user object which invoked the call, we are going to use the GraphAPI and PowerShell to send an email. I know what you’re saying: “But Steve, I can just use the Send-MailMessage cmdlet”, which is true, but what this solution gets you is a record of the email in your Sent Items, all without needing to set up anything to do with SMTP.
So the major difference from the last script to this one is that we need to construct a JSON object, which we then POST to the GraphAPI, rather than using a GET as we did previously. An example of this is shown below:
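Here is a minimal sketch of that POST, assuming $token holds a delegated access token obtained as in the previous post; the recipient address is a placeholder.

# Build the message as a hashtable and convert it to the JSON the sendMail action expects.
$body = @{
    message = @{
        subject = 'Hello from the GraphAPI'
        body = @{
            contentType = 'Text'
            content     = 'This email was sent with PowerShell and the GraphAPI.'
        }
        toRecipients = @(
            @{ emailAddress = @{ address = 'someone@contoso.com' } }
        )
    }
    saveToSentItems = $true
} | ConvertTo-Json -Depth 10

# POST the JSON to the sendMail action on the signed-in user's mailbox.
$uri = 'https://graph.microsoft.com/v1.0/me/sendMail'
Invoke-RestMethod -Uri $uri -Method Post -Headers @{ Authorization = "Bearer $token" } -Body $body -ContentType 'application/json'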

In my sample I have kept it simple with a single recipient, a subject, and a body, but there is a vast amount of information which you can define for the email you’re sending; you can find out more about it here.
For a full copy of the script please look here.

Good Luck

Steve

PowerShell GraphAPI

In the last blog post we went over what the GraphAPI is and why it’s all-powerful. With this post we are going to go over how you can leverage the GraphAPI via PowerShell.
First things first, we need to register an application at https://apps.dev.microsoft.com/. This is pretty straightforward; just make sure you save the Client Secret when you create it, as you will only see it once! The other thing to note is that you need to add a platform of “Web”. This address does not need to resolve, nor do your PowerShell scripts need to use it.
The other important part on this page is to allocate permissions to the application, which all depends upon what you want to interrogate in the GraphAPI. For more information have a look here, where Microsoft has documented the process in more detail than I have.

So now that we have the Application ID from the new application, the Client Secret, and the Redirect Address, we can create a simple PowerShell script like the one below (update the variables to match your environment).
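The original script is not reproduced here, so the following is a minimal sketch. It uses the client-credentials (app-only) flow against the v2.0 token endpoint; because an app-only token has no signed-in user, it queries /users/{userPrincipalName} rather than /me as the original did, but the output shape is the same. The tenant, IDs, and secret are all placeholders.

# Update these variables to match your environment.
$tenant       = 'yourtenant.onmicrosoft.com'
$clientId     = '<Application ID>'
$clientSecret = '<Client Secret>'

# Request an app-only token from Azure AD for Microsoft Graph.
$tokenBody = @{
    grant_type    = 'client_credentials'
    client_id     = $clientId
    client_secret = $clientSecret
    scope         = 'https://graph.microsoft.com/.default'
}
$tokenResponse = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenant/oauth2/v2.0/token" -Body $tokenBody

# Call the GraphAPI with the token in the Authorization header.
$uri = 'https://graph.microsoft.com/beta/users/mail@AusIgnite2017.onmicrosoft.com'
Invoke-RestMethod -Uri $uri -Method Get -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" }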

Once you have updated the script to match your environment you can execute it, and you should see something like this:

@odata.context : https://graph.microsoft.com/beta/$metadata#users/$entity
id : 2f2916e5-953c-4c5d-9f52-bd5db8131d49
accountEnabled : True
assignedLicenses : {@{disabledPlans=System.Object[]; skuId=c7df2760-2c81-4ef7-b578-5b5392b571df}, @{disabledPlans=System.Object[]; skuId=061f9ace-7d42-4136-88ac-31dc755f143f}}
assignedPlans : {@{assignedDateTime=2017-01-18T19:43:48Z; capabilityStatus=Enabled; service=SharePoint; servicePlanId=e95bec33-7c88-4a70-8e19-b10bd9d0c014}, @{assignedDateTime=2017-01-18T19:43:48Z; capabilityStatus=Enabled;
service=SharePoint; servicePlanId=5dbe027f-2339-4123-9542-606e4d348a72}, @{assignedDateTime=2017-01-18T19:43:48Z; capabilityStatus=Enabled; service=exchange; servicePlanId=efb87545-963c-4e0d-99df-69c6916d9eb0},
@{assignedDateTime=2017-01-18T19:43:48Z; capabilityStatus=Enabled; service=MicrosoftCommunicationsOnline; servicePlanId=0feaeb32-d00e-4d66-bd5a-43b5b83db82c}…}
businessPhones : {}
city :
companyName :
country :
department :
displayName : Steven Hosking
givenName : Steven
jobTitle :
mail : mail@AusIgnite2017.onmicrosoft.com
mailNickname : steve
mobilePhone :
onPremisesImmutableId :
onPremisesLastSyncDateTime :
onPremisesSecurityIdentifier :
onPremisesSyncEnabled :
passwordPolicies :
passwordProfile :
officeLocation :
postalCode :
preferredLanguage : en-AU
provisionedPlans : {@{capabilityStatus=Enabled; provisioningStatus=Success; service=exchange}, @{capabilityStatus=Enabled; provisioningStatus=Success; service=exchange}, @{capabilityStatus=Enabled; provisioningStatus=Success;
service=exchange}, @{capabilityStatus=Enabled; provisioningStatus=Success; service=exchange}…}
proxyAddresses : {SMTP:mail@AusIgnite2017.onmicrosoft.com}
refreshTokensValidFromDateTime : 2017-01-18T07:01:55Z
showInAddressList :
state :
streetAddress :
surname : Hosking
usageLocation : AU
userPrincipalName : mail@AusIgnite2017.onmicrosoft.com
userType : Member

If you want to test other options you can change the URI in the script ($uri = “https://graph.microsoft.com/beta/me/” in the original) to query anything which has been published via the GraphAPI; you can find more information here by browsing through the options on the left of the page.
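For instance, a hypothetical swap to the users collection (assuming the token has been granted permission to read users) returns every user object rather than just one:

# Query the full user list instead of a single user.
$uri = 'https://graph.microsoft.com/v1.0/users'
Invoke-RestMethod -Uri $uri -Method Get -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" }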

You can also find all of the references for the beta GraphAPI calls, which at the moment include all of the calls for Intune. These might not work on your subscription today; the Intune GraphAPI calls will be green-lit for your subscription once it has been migrated over to the Ibiza portal (portal.azure.com).

Good Luck

Steve

What the heck is the GraphAPI and why should I care?

So recently I have been spending time learning about the new preview Intune portal, which is moving over to the Ibiza Azure portal. As part of this migration from the existing Silverlight Intune portal to the new Ibiza portal, Microsoft is working on exposing a vast amount of information (if not all of it) for your Intune subscription via the GraphAPI. I’m going to quote the Microsoft Graph website here: the GraphAPI is “One endpoint to rule them all”. I’m sure there is a joke about rings and volcanoes that could be put in here.
But in all seriousness, the GraphAPI is well on its way to living up to that tag line. As I write this there is already a vast amount of objects from the Office world available to interrogate; for example, if you do a simple GET request on https://graph.microsoft.com/v1.0/me you will see a JSON response. I’m sure you have just clicked on the link and got a JSON reply looking something like this:
{
  "error": {
    "code": "InvalidAuthenticationToken",
    "message": "Bearer access token is empty.",
    "innerError": {
      "request-id": "c6a2dcfe-4ee6-489a-9de3-a5573c6e2576",
      "date": "2017-01-27T03:35:07"
    }
  }
}

So as you might have guessed by now, it’s not completely straightforward to interrogate the GraphAPI. This is actually a really good thing, as you need to provide an access token to be able to query the data from the GraphAPI. To test a query you can use the Graph Explorer, which you can find here: https://graph.microsoft.io/en-us/graph-explorer
By default it is signed in with a demo tenant where you can test out the API calls, but once you sign in you can interrogate your own data. You will be prompted to allow the Graph Explorer access to your data during logon, so you know what type of items the GraphAPI has access to.
If we run the same query as we previously ran against the GraphAPI, we get a JSON reply which looks something like this:
{
  "@odata.context": "https://graph.microsoft.com/v1.0/$metadata#users/$entity",
  "id": "2f2916e5-953c-4c5d-9f52-bd5db8131d49",
  "businessPhones": [],
  "displayName": "Steven Hosking",
  "givenName": "Steven",
  "jobTitle": null,
  "mail": "steve@AusIgnite2017DEMO.onmicrosoft.com",
  "mobilePhone": null,
  "officeLocation": null,
  "preferredLanguage": "en-AU",
  "surname": "Hosking",
  "userPrincipalName": "steve@AusIgnite2017DEMO.onmicrosoft.com"
}

As you can see, the GraphAPI is quite secure, in that you can’t just randomly query another company’s GraphAPI objects without an access token. In this demo the Graph Explorer handles the process of obtaining access to your subscription, be it Intune, Office 365, or Azure Active Directory.
It’s also worth noting that, as of today, to access the Intune sections of the API you need to change v1.0 to beta in the URLs. That said, not all Intune subscriptions have been migrated to the Ibiza portal which enables the GraphAPI, so you might not be able to test this. To find more information regarding the Intune GraphAPI have a look here, as it will be continually updated during the rollout of Ibiza portal support.

In future blog posts I will explain the process to query the GraphAPI with PowerShell.

Good Luck

Steve

Rerouting PowerShell text Streams to Application Insights

So recently I have been working on a solution where we have blended scripting languages, being PowerShell and C#, and found logging to be a little painful, in the sense that if you use a single file to track the process (which would be ideal) there is a good chance there will be write locks and similar issues between the scripts. With that in mind I started digging deep into Application Insights for the C# solution, and tracking down the process for using Application Insights in PowerShell. The process for both languages seems pretty straightforward: load the DLL, add the instrumentation key from your Application Insights subscription, then start sending messages up to the service.

Great, so we have a logging solution which will work across both languages with ease; the next step is to implement the logging. For a website written in C# this is a matter of installing the NuGet package and configuring the solution to use the correct subscription; it will then capture the cool things like page load times using the default configuration. All in all, about 20-30 minutes to capture basic audit information.

For PowerShell it’s not so straightforward, as you need to explicitly state each time you want to log something, so we need to update the script. This is where I started balking, as the scripts in question were a mess of over 2000 lines across 2 different files. But I found that if I use the -Verbose switch on the top-level function it returns a detailed step-by-step of what’s being completed, and with the standard PowerShell redirection “*>&1” it will pipe out to a text file easily. Great, we are now back at writing it into a log file rather than Application Insights.

This got me thinking: can I pipe the text streams from the function into another function? The simple answer is yes, it is possible, so I can just push the text directly up to Application Insights. Fantastic: by default I get a GMT time stamp, and I know where it was invoked from.

But a text message out of context is about as useful as disconnected log files. Here is where I found out something about these messages which I hadn’t realized: they are actually objects which contain the line the script threw the message from, along with the PS1 file it was invoked from. Great, this makes it super easy. Well, not quite; each of the different streams is a different object type, with the exception of Write-Output, which is actually a “System.String”, which makes it harder to capture (not impossible though).

So with the above in mind, here is a sample script which shows the process of redirecting all of the different text streams from a function up to Application Insights. I also included a switch to have the results print back on the screen if they still need to be actioned.
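The original sample script is not reproduced here, so the following is a minimal sketch of the idea. It assumes Microsoft.ApplicationInsights.dll is already on disk and that the instrumentation key is filled in; the DLL path, the key, and the Invoke-LongRunningProcess function are placeholders.

# Load the Application Insights DLL and create a telemetry client (path and key are placeholders).
Add-Type -Path 'C:\Tools\Microsoft.ApplicationInsights.dll'
$client = New-Object Microsoft.ApplicationInsights.TelemetryClient
$client.InstrumentationKey = '<your instrumentation key>'

function Send-StreamToAppInsights {
    param(
        [Parameter(ValueFromPipeline = $true)] $Record,
        [switch] $PassThru
    )
    process {
        # Each redirected stream arrives as a different record type; Write-Output arrives as a plain string.
        $type = $Record.GetType().Name
        if ($Record.PSObject.Properties['InvocationInfo']) {
            # Verbose/Warning/Error/Debug records carry the script name and line number that produced them.
            $line   = $Record.InvocationInfo.ScriptLineNumber
            $script = $Record.InvocationInfo.ScriptName
            $client.TrackTrace("[$type] $script line $line : $Record")
        }
        else {
            $client.TrackTrace("[$type] $Record")
        }
        # Optionally pass the record back out so it still shows on screen.
        if ($PassThru) { $Record }
    }
    end { $client.Flush() }
}

# Redirect every stream (*>&1) from the top-level function into the logging function.
Invoke-LongRunningProcess -Verbose *>&1 | Send-StreamToAppInsights -PassThru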

So I hear you thinking: great, you’re using Application Insights, but I don’t have access to that. Well, here is the cool part: you can change out the lines which refer to Application Insights for your favorite logging tool, or even have a default action which stops the script if an exception is detected.

Good Luck

 

Replace NSLookup with Resolve-DnsName

So we have all been there: we need to confirm that a DNS record exists, or we have a script which needs the IP address rather than the DNS name. Traditionally we would use NSLookup to validate all of this.
Recently I needed to complete a solution based in PowerShell where I had to create a DNS record and then validate that it exists and that the IP address has been set correctly.
Rather than writing heaps of extra code to handle the result from NSLookup in PowerShell, or instantiating the .NET DNS object, I spent the time to look into the PowerShell options.
This is where I stumbled upon Resolve-DnsName, which is fantastic. It allows you to not only complete a generic lookup of the DNS name, but also use the -Type parameter; for example, -Type A will bring back only the A records for that DNS name:

[Screenshot: Resolve-DnsName output filtered to A records]
If we were to then run the same query with AAAA as the type (IPv6 Address) we get the following:

[Screenshot: Resolve-DnsName output filtered to AAAA records]
Where the cmdlet comes into its own is that you can save the result into an object and then use it to complete tasks which require an IP address, for example like this:

[Screenshot: saving the Resolve-DnsName result into a variable and reusing the IP address]
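Since the screenshots do not reproduce here, a rough equivalent of the three examples is below; the host name is a placeholder, so swap in one of your own records.

# Generic lookup, then restrict to A records only.
Resolve-DnsName -Name www.contoso.com
Resolve-DnsName -Name www.contoso.com -Type A

# The same query for the IPv6 (AAAA) records.
Resolve-DnsName -Name www.contoso.com -Type AAAA

# Save the result into an object and reuse the IP address elsewhere.
$dns = Resolve-DnsName -Name www.contoso.com -Type A
$ip  = $dns.IPAddress | Select-Object -First 1
Test-Connection -ComputerName $ip -Count 1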
Obviously this is just scratching the surface of what you can do with the cmdlet, but hopefully it is a starting point for people to think about using more PowerShell.
Good Luck
Steve

How to install Azure Stack TP2 on a server with less than 12 CPU Cores

Currently, out of the box (binaries), Azure Stack TP2 won’t install on anything with less than 12 cores on the host server, be it physical or virtual. (Yes, you can install Azure Stack with nested virtualization to get around the physical disk allocation issue, but be aware there is a blue screen bug when running it, so it is not advised.)

If you’re like me and can’t get your hands on shiny new kit for a lab/testing environment and need to make do with DL380 G6s, which have 8 cores, a truck load of RAM, and the required number of disks, and which worked fast enough for Azure Stack TP1, they don’t work out of the box for Azure Stack TP2. So below is the process we worked out to allow the installation of Azure Stack TP2 on an under-specced system.

So you have already found the config.xml file located in C:\CloudDeployment, which looks something like the below, and you have already searched the file, found the MinimumNumberOfCoresPerMachine value, and updated it to the number of cores you have. From here you reran the PowerShell command and it failed. The next step was to search the internet and find this blog.

[Screenshot: config.xml, showing the ValidationRequirements section with the MinimumNumberOfCoresPerMachine value]

So now to find the real config file which is used to install the bare metal (host machine) for Azure Stack. It is located here: C:\CloudDeployment\Configuration\Roles\Infrastructure\BareMetal\OneNodeRole.xml. Easy, right? We make the change to MinimumNumberOfCoresPerMachine again, save the file, then rerun the PowerShell command, and it fails again… now this is getting annoying.

After some digging we find that during the installation Azure Stack creates a group of encrypted files located in C:\EceStore; these contain an encrypted version of the config file from the first time you ran the PowerShell script. So we delete this folder, and the config.xml file from C:\CloudDeployment, and rerun the whole process from the start. Presto, the installation process completes this time around.
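As a rough sketch of those steps in PowerShell (the element path inside OneNodeRole.xml is an assumption based on the value name above, so verify it in your copy of the file before running this):

# Update the minimum core count in the bare-metal role definition (element name assumed from the text above).
$roleFile = 'C:\CloudDeployment\Configuration\Roles\Infrastructure\BareMetal\OneNodeRole.xml'
[xml]$role = Get-Content -Path $roleFile
$node = $role.SelectSingleNode('//MinimumNumberOfCoresPerMachine')
$node.'#text' = '8'
$role.Save($roleFile)

# Remove the cached encrypted configuration and the stale config.xml, then rerun the deployment script.
Remove-Item -Path 'C:\EceStore' -Recurse -Force
Remove-Item -Path 'C:\CloudDeployment\config.xml' -Force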

Good Luck,

Steve

Replace Telnet with Test-NetConnection PowerShell Module

While troubleshooting some issues for a client this week I needed to test that a port was open. Traditionally our first step for this would be to reach for Telnet. But given it is 2016, and Telnet is a feature that has been intentionally removed by default from Windows for the last 4 years as a result of security concerns, I don’t have it enabled on my laptop.

So, the question is: if not Telnet, what do I use for testing a port? Well, the easy answer is PowerShell.

The cmdlet in question is “Test-NetConnection”. It sounds like it would just do a ping, which doesn’t really help me, right? Well, that’s true, but you can then use the -Port parameter, which allows you to define the port you wish to send the packet to, to test that the port is open.

Here is an example of how to use the command. The first usage in the image is a failed reply, while the second usage is a successful response.

[Screenshot: Test-NetConnection output showing a failed and a successful port test]
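If the image does not load, the equivalent commands look like this (host names are placeholders):

# A closed or filtered port comes back with TcpTestSucceeded : False.
Test-NetConnection -ComputerName server01.contoso.com -Port 23

# An open port (for example HTTPS on a public web server) returns TcpTestSucceeded : True.
Test-NetConnection -ComputerName www.bing.com -Port 443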

Good Luck.

Steve

Windows Azure Pack PowerShell SQL Resource Provider Add SQL Server Hosting Group to Plan

This post is the fifth in the series of blog posts for Azure Pack. We have already gone over the process of obtaining, adding, and removing the core components of Azure Pack, being the User, Plan, and Subscription objects; then we moved on to adding and removing SQL Server hosting groups and their dependent SQL servers in the SQL Resource Provider. So in this post we will complete the circle on the administrative side by adding the newly created SQL Server group to a Plan.

Unlike the past posts, where the concepts were quite self-explanatory, attaching the SQL Server hosting group to a Plan needs to be completed in a two-step process: the first step is to associate the SQL Resource Provider with the Plan; once it has been attached, the second is to allocate the actual resources by using a Quota.

From here your users will be able to create their own databases on the presented SQL Servers.

# query SQL function
function Invoke-SQL {
    param([string] $connstring, [string] $sqlCommand)
    $connectionString = $connstring
    $connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    $command = New-Object System.Data.SqlClient.SqlCommand($sqlCommand, $connection)
    $connection.Open()
    $adapter = New-Object System.Data.SqlClient.SqlDataAdapter $command
    $dataset = New-Object System.Data.DataSet
    $adapter.Fill($dataset) | Out-Null
    $connection.Close()
    $dataset.Tables
}

# get WAP connection string
$connstring = (Get-MgmtSvcSetting -Namespace adminsite | Where-Object {$_.Name -eq 'ApplicationServicesConnectionstring'}).Value
$connstring = $connstring.Split(';')[0] + ';' + $connstring.Split(';')[1] + '; Integrated Security=SSPI'

# get WAP websites
$adminsite = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'AdminSite' and Name = 'Authentication.Fqdn'" -connstring $connstring).value
$TenantSite = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'TenantSite' and Name = 'Authentication.Fqdn'" -connstring $connstring).value
$AuthSite = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'AuthSite' and Name = 'Authentication.Fqdn'" -connstring $connstring).value
$windowsauthsite = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'WindowsAuthSite' and Name = 'Authentication.Fqdn'" -connstring $connstring).value
$AdminAPI = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'AdminSite' and Name = 'Microsoft.Azure.Portal.Configuration.OnPremPortalConfiguration.RdfeAdminUri'" -connstring $connstring).value
$TenantAPI = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'TenantSite' and Name = 'Microsoft.Azure.Portal.Configuration.AppManagementConfiguration.RdfeUnifiedManagementServiceUri'" -connstring $connstring).value
$ClientRealm = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'AdminSite' and Name = 'Authentication.RelyingParty'" -connstring $connstring).value.Split(',')[1].Replace('"Realm":"','').Replace('"','')

# check if the admin site cert is signed
$admin = $adminsite.Split(':')
$cert = !(New-Object System.Net.Security.SslStream((New-Object System.Net.Sockets.TcpClient($admin[1].Replace('/',''), $admin[2])).GetStream())).IsSigned

# get token
$token = Get-MgmtSvcToken -AuthenticationSite $windowsauthsite -ClientRealm $ClientRealm -Type Windows -DisableCertificateValidation:$cert

# Add SQL Server Group to Plan
$ResProviderName = 'sqlservers'
$PlanName = 'steve'
$plan = Get-MgmtSvcPlan -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -DisplayName $PlanName
$ResourceProvider = Get-MgmtSvcResourceProvider -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -Name $ResProviderName
Add-MgmtSvcPlanService -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -InstanceId $ResourceProvider.InstanceId -ServiceName $ResourceProvider.Name -PlanId $plan.Id

# Add SQL Quota to Plan
$ServiceName = 'sqlservers'
$QuotaList = New-MgmtSvcQuotaList
$ResProvider = Get-MgmtSvcResourceProvider -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -Name $ServiceName
$SQLQuota = Add-MgmtSvcListQuota -QuotaList $QuotaList -ServiceName $ServiceName -ServiceInstanceId $ResProvider.InstanceId
$sqlquotares = Add-MgmtSvcQuotaSetting -Quota $SQLQuota -Key Editions -Value '[{"displayName":"Laptop", "groupName":"Laptop", "resourceCount":"1", "resourceSize":"1024", "resourceSizeLimit":"1024", "offerEditionId":"081313063701", "groupType":null}]'
Update-MgmtSvcPlanQuota -AdminUri $AdminAPI -Token $token -QuotaList $QuotaList -PlanId $plan.Id -DisableCertificateValidation:$cert

# Remove SQL Server group from Plan
$ResProviderName = 'sqlservers'
$PlanName = 'steve'
$plan = Get-MgmtSvcPlan -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -DisplayName $PlanName
Remove-MgmtSvcPlanService -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -ServiceName $ResProviderName -PlanId $plan.Id

# Get list of Services attached to Plan
$PlanName = 'steve'
$plan = Get-MgmtSvcPlan -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -DisplayName $PlanName
$plan.ServiceQuotas.Settings.Value

I hope this helps and makes sense. Please let me know via Twitter (@steve_hosko) if you would like any more information about this.

Good Luck

Steve

Windows Azure Pack PowerShell SQL Resource Provider Add Servers & Hosting Group

Here we are again to go over some more commands for automating Azure Pack, in this case the addition and/or removal of SQL servers in your existing infrastructure. In my previous posts I have gone over the Get, Add, and Remove operations for Users, Plans, and Subscriptions, which are the core components of Azure Pack. For this post I will go over the process of obtaining the SQL Server hosting group; in this case it is a standalone group, as Microsoft has not exposed the ability to add a highly available group via PowerShell.

Once we have spent our time working on the SQL Server hosting group, we will move on to adding a new SQL server to the group, along with removing a SQL server.

Much like the last posts, I have included the preamble where we obtain all of the required details for Azure Pack before executing the commands.

# query SQL function
function Invoke-SQL {
    param([string] $connstring, [string] $sqlCommand)
    $connectionString = $connstring
    $connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    $command = New-Object System.Data.SqlClient.SqlCommand($sqlCommand, $connection)
    $connection.Open()
    $adapter = New-Object System.Data.SqlClient.SqlDataAdapter $command
    $dataset = New-Object System.Data.DataSet
    $adapter.Fill($dataset) | Out-Null
    $connection.Close()
    $dataset.Tables
}

# get WAP connection string
$connstring = (Get-MgmtSvcSetting -Namespace adminsite | Where-Object {$_.Name -eq 'ApplicationServicesConnectionstring'}).Value
$connstring = $connstring.Split(';')[0] + ';' + $connstring.Split(';')[1] + '; Integrated Security=SSPI'

# get WAP websites
$adminsite = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'AdminSite' and Name = 'Authentication.Fqdn'" -connstring $connstring).value
$TenantSite = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'TenantSite' and Name = 'Authentication.Fqdn'" -connstring $connstring).value
$AuthSite = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'AuthSite' and Name = 'Authentication.Fqdn'" -connstring $connstring).value
$windowsauthsite = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'WindowsAuthSite' and Name = 'Authentication.Fqdn'" -connstring $connstring).value
$AdminAPI = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'AdminSite' and Name = 'Microsoft.Azure.Portal.Configuration.OnPremPortalConfiguration.RdfeAdminUri'" -connstring $connstring).value
$TenantAPI = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'TenantSite' and Name = 'Microsoft.Azure.Portal.Configuration.AppManagementConfiguration.RdfeUnifiedManagementServiceUri'" -connstring $connstring).value
$ClientRealm = (Invoke-SQL -sqlCommand "SELECT value FROM [Config].[Settings] where Namespace = 'AdminSite' and Name = 'Authentication.RelyingParty'" -connstring $connstring).value.Split(',')[1].Replace('"Realm":"','').Replace('"','')

# check if the admin site cert is signed
$admin = $adminsite.Split(':')
$cert = !(New-Object System.Net.Security.SslStream((New-Object System.Net.Sockets.TcpClient($admin[1].Replace('/',''), $admin[2])).GetStream())).IsSigned

# get token
$token = Get-MgmtSvcToken -AuthenticationSite $windowsauthsite -ClientRealm $ClientRealm -Type Windows -DisableCertificateValidation:$cert

# Get SQL Server Hosting Group
Get-MgmtSvcSqlServerGroup -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert

# Remove SQL Server Hosting Group
$SQLSvrGroupName = 'Laptop'
$SQLSvrGroup = Get-MgmtSvcSqlServerGroup -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -GroupName $SQLSvrGroupName
Remove-MgmtSvcSqlServerGroup -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -ServerGroupId $SQLSvrGroup.GroupId

# Add SQL Server Hosting Group
$GRPNAME = 'Laptop'
Add-MgmtSvcSqlServerGroup -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -GroupName $GRPNAME -ResourceGovernorEnabled

# Get SQL Server
Get-MgmtSvcSqlHostingServer -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert

# Remove SQL Server
$Svrname = 'desktop-ojt0lg6'
$sqlsvr = Get-MgmtSvcSqlHostingServer -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert | Where-Object {$_.Name -eq $Svrname}
Remove-MgmtSvcSqlHostingServer -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -HostingServerId $sqlsvr.ServerId

# Add SQL Server
$cred = Get-Credential -UserName 'SA' -Message 'Enter the SA Password for the SQL Server'
$GRPNAME = 'Laptop'
$SQLSize = 1024
$SQLCPU = 4
$SQLMEM = 4
$SQLIOPS = 2000
$SQLMAXResourcePools = 10
$SQLSVRName = 'desktop-ojt0lg6'
$svrgrp = Get-MgmtSvcSqlServerGroup -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert | Where-Object {$_.GroupName -eq $GRPNAME}
Add-MgmtSvcSqlHostingServer -AdminUri $AdminAPI -Token $token -DisableCertificateValidation:$cert -SqlUser $cred -Name $SQLSVRName -TotalSpaceMB $SQLSize -ServerGroupId $svrgrp.GroupId -NumberOfCpuCores $SQLCPU -TotalMemoryGB $SQLMEM -SupportedIopsPerVolume $SQLIOPS -MaximumResourcePools $SQLMAXResourcePools

This is where I will leave you on this post, but stay tuned for future posts around adding the SQL Server hosting group to a Plan.

Good Luck

Steve