Channel: Secure Infrastructure Blog

Configuring SQL Reporting Services for Remote or different SCOM DW Database


This article describes how to configure SQL Server Reporting Services to generate reports from a remote SCOM Data Warehouse database (DW DB). It will also help when reconfiguring Reporting Services after a change in the SCOM DW DB name, port, or server name.

In a simple single-server test installation of SCOM, there is no need to configure these settings, since all the databases and the reporting services live on the same server. However, in production environments, where Reporting Services and the Data Warehouse database are deployed separately, we need to configure Reporting Services to use SCOM's Data Warehouse database. This database may reside on a remote SQL failover cluster or be part of an AlwaysOn Availability Group.

The configuration below was done in an environment where Reporting Services is installed on one of the SCOM management servers and the Data Warehouse DB resides on a remote SQL AlwaysOn Availability Group.

Steps:

Access the SQL Server Reporting Services web page and click Details View.

Now, click the Data Warehouse Main data source and select "Manage" from the dropdown menu.

Under Data Source Type, select "Microsoft SQL Server".

Under Connection String, provide the connection string details in the following format:

data source=<DBINSTANCE or Listener Details>;initial catalog=<DW DB Name>;Integrated Security=SSPI

e.g.:

data source=SCOMListener01,14533;initial catalog=OperationsManagerDW;Integrated Security=SSPI

(where SCOMListener01 is the SQL AlwaysOn Listener listening on port 14533 and OperationsManagerDW is the Data Warehouse Database part of the Availability Group)
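Optionally, before changing the data source, you can verify from the Reporting Services server that the listener is reachable with Windows authentication. Below is a minimal PowerShell sketch; it reuses the listener and database names from the example above, so adjust them to your environment:

# Verify connectivity to the DW database from the Reporting Services server
# (SCOMListener01,14533 and OperationsManagerDW are taken from the example above)
$connectionString = 'Data Source=SCOMListener01,14533;Initial Catalog=OperationsManagerDW;Integrated Security=SSPI'
$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
try {
    $connection.Open()
    Write-Output "Connection succeeded: $($connection.State)"
}
catch {
    Write-Output "Connection failed: $($_.Exception.Message)"
}
finally {
    $connection.Dispose()
}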

Select "Windows Integrated Security" under the "Connect Using" options.

Click "Test Connection" to make sure that the connection succeeds.

Click Apply.

 

This ensures that Reporting Services is correctly connected to the DW database.

Likewise, whenever the SCOM database or server changes, or the listening ports change for security or maintenance reasons, the same steps can be performed with the new details in the connection string.


Import Database schema in Azure SQL DB from .SQL files programmatically with SQLCMD


Introduction – This blog post illustrates how you can import your database schema and tables into an empty Azure SQL DB (PaaS) programmatically. Azure SQL DB currently supports importing from a BACPAC file through PowerShell and the GUI, but not from .SQL files.

Assumptions – We assume that you already have .SQL files generated from your on-premises database, ready to deploy to Azure SQL DB.

Problem statement – I had a requirement to import schema and tables into an empty Azure SQL DB from .SQL files. Azure currently provides BACPAC import out of the box from PowerShell, the GUI, and SQL Server Management Studio, but the requirement here was to do it programmatically every time the ARM deployment script creates a new Azure SQL DB.

 

Resolution/Workaround – Follow the steps below.

1. Install SQLCMD on the VM/desktop from which you are running the script or deployment. The SQLCMD utility is used to deploy the SQL files into Azure SQL DB; the ODBC driver is required for installing SQLCMD.

ODBC driver - http://www.microsoft.com/en-in/download/details.aspx?id=36434

SQLCMD - http://www.microsoft.com/en-us/download/details.aspx?id=36433

2. Save all the SQL files into a folder in local VM.

3. Get the public IP of your local VM/desktop using the code below.

# Ask checkip.dyndns.com for the public IP and strip the surrounding HTML
$IP = Invoke-WebRequest checkip.dyndns.com
$IP1 = $IP.Content.Trim()
$IP2 = $IP1.Replace("<html><head><title>Current IP Check</title></head><body>Current IP Address: ","")
$FinalIP = $IP2.Replace("</body></html>","")

4. Create a new firewall rule so your IP can connect to the SQL server.

New-AzureRmSqlServerFirewallRule -FirewallRuleName $rulename -StartIpAddress $FinalIP -EndIpAddress $FinalIP -ServerName $SQLservername -ResourceGroupName $resourcegroupname

5. Save the SQL server's full name and the sqlcmd path into variables.

$Fullservername = $SQLservername + '.database.windows.net'
$sqlcmd = "C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn\SQLCMD.EXE"

6. Save the SQL server credentials and the Azure SQL DB name in variables.

$username = "SQLusername"
$password = "SQLpassword"
$dbname = "databasename"

7. Run the command below for each SQL file if you want to import them sequentially.

& $sqlcmd -U $username -P $password -S $Fullservername -d $dbname -I -i "C:\SQL\file1.sql"

& $sqlcmd -U $username -P $password -S $Fullservername -d $dbname -I -i "C:\SQL\file3.sql"

& $sqlcmd -U $username -P $password -S $Fullservername -d $dbname -I -i "C:\SQL\filen.sql"

 

NOTE – You can assemble all of this code into your deployment scripts, along with functions and error logging.

 

Thanks folks, hope it is useful.

Happy blogging!


Tool to calculate the 90th percentile for common transactions in a Visual Studio load test run


Requirement
As a performance tester, I need to calculate the 90th percentile for common transactions from the transaction summary of load test runs.

Problem Statement
In the Visual Studio test transaction summary report, we have scenarios where the same transaction is reported multiple times with different response times (due to load and application behavior).
Example: a Login transaction, used as the first transaction in every scenario, can report different response times in different scenarios.
In such cases we usually calculate the 90th percentile or average of the common Login transactions and report that to developers or the customer.
This ensures consistency and provides accurate results.

Problem Solution:
Below is a generic utility, a SQL stored procedure, that automatically calculates the 90th percentile for all common transactions across their various response times (Avg/90th percentile).

Steps to execute the stored procedure
1. Connect to the LoadTest results database.
2. Execute the T-SQL below to create the stored procedure.

Create Procedure Calc90thPercentileForCommonTransactions @loadtestid int
As
Begin

---- Load the Visual Studio test results into a temp table
Select * into #TempTable
from
(select distinct LTC.TestCaseName, LTTSD.LoadTestRunId, WLTT.TransactionName, LTTSD.Percentile90,
PERCENTILE_CONT ( 0.9 ) WITHIN GROUP ( ORDER BY LTTSD.Percentile90 )
OVER ( partition by WLTT.TransactionName ) as 'CalculatedPercentile90th'
from LoadTestTransactionSummaryData LTTSD
join WebLoadTestTransaction WLTT
    on LTTSD.TransactionId = WLTT.TransactionId
   and LTTSD.LoadTestRunId = WLTT.LoadTestRunId
join LoadTestCase LTC
    on LTC.TestCaseId = WLTT.TestCaseId
   and LTC.LoadTestRunId = LTTSD.LoadTestRunId
where LTTSD.LoadTestRunId = @loadtestid) as result;

---- Calculate the 90th percentile for common transactions
WITH DUP
AS (
SELECT TransactionName
FROM #TempTable
GROUP BY TransactionName
HAVING COUNT(1) > 1)

SELECT T.TestCaseName, T.TransactionName,
T.Percentile90 as '90thPercentileFromTestResult',
T.CalculatedPercentile90th as '90thPercentileForCommonTransaction',
CASE
WHEN DUP.TransactionName IS NOT NULL
THEN 'Yes'
ELSE 'No'
END AS IsCommonTransaction,
CASE
WHEN DUP.TransactionName IS NOT NULL
THEN CalculatedPercentile90th
ELSE Percentile90
END AS Consolidated90thPercentileToReport
FROM #TempTable T
LEFT JOIN DUP ON T.TransactionName = DUP.TransactionName;

End

3. Execute the stored procedure:

Exec Calc90thPercentileForCommonTransactions @loadtestid

where @loadtestid is the run ID of the test.
Example: Exec Calc90thPercentileForCommonTransactions 1555
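If you prefer running it from PowerShell, here is a minimal sketch using Invoke-Sqlcmd from the SqlServer module; the server instance and database name are placeholders for your LoadTest results database:

# Execute the stored procedure against the LoadTest results database
# 'localhost' and 'LoadTestResults' are placeholders - adjust to your environment
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'LoadTestResults' `
    -Query 'EXEC Calc90thPercentileForCommonTransactions @loadtestid = 1555' |
    Format-Table -AutoSize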

4. The stored procedure returns the following columns:
TestCaseName: name of the test case.
TransactionName: name of the transaction.
90thPercentileFromTestResult: the 90th-percentile response time from the transaction summary of the test (additional info for debugging purposes).
90thPercentileForCommonTransaction: the calculated 90th percentile across all common transactions' response times (additional info for debugging purposes).
IsCommonTransaction: 'Yes' if it is a common transaction (present more than once), 'No' if it is unique (additional info for debugging purposes).
Consolidated90thPercentileToReport: the final response time to report to developers or the customer, which uses the calculated 90th-percentile value for all common transactions (present more than once).


Migrating PerformancePoint to a New SharePoint Site with a Different Path


Introduction

When you migrate a SharePoint site to a different location, in a different site structure with a different path, the content itself is migrated, but some components will not work as expected because the site URL structure is different.

One of these components is Performance Point content.

PerformancePoint content contains links to the PerformancePoint connections in the Connections library. Because the path has changed, these links are no longer valid. The dashboards also contain reference links to other PerformancePoint content such as KPIs, reports, and scorecards; these reference links become invalid as well.

This causes the PerformancePoint web parts on the dashboard pages to show an error saying that the data source does not exist or you don't have permission.

Migration Steps

The migration procedure consists of two major steps:

1. Exporting the PerformancePoint content and connections into a Dashboard Designer Workspace (ddwx) file from the source environment.

2. Importing the ddwx file into the destination environment.

Export a Dashboard Designer Workspace (ddwx) file from the source environment

1. Launch PerformancePoint Dashboard Designer.

2. Click on the PerformancePoint content list.

3. Select all the items in the list (Ctrl-A).


 

4. Click the "Add Items" button on the ribbon, under the Workspace section.


 

5. All items should now be added to the workspace area.

6. Apply steps 2 to 5 to all the PerformancePoint content lists and connections lists.

7. Save the workspace by clicking the Office button, then Save Workspace As.

 

Import the Dashboard Designer Workspace (ddwx) file into the destination environment

In this step you import the ddwx file into your destination environment.

1. Launch PerformancePoint Dashboard Designer.

2. Click on Import Items.

 

3. Map each PerformancePoint item to the corresponding item in your destination environment.

 

4. Select "Import data sources that already exist in the destination" and click Next.

 

5. Wait until the import is completed. Make sure that all items are updated with no errors.

Issues

I once faced an issue after finishing the migration process: some reports and KPIs still had uncorrected connection links. I figured out that more than one PerformancePoint report or KPI had the same name within the same content list. In that case, one of the reports with the duplicate name was updated while the other was not, and the same happened with the KPIs.

In this case I had to recreate the reports and KPIs that were not updated.

 

 


Efficient way to retrieve work item details from a linked query using the TFS API


Requirement:
Retrieve work item details from a linked query using the TFS API.

Problem Statement:
We can run a linked query using the TFS work item query language (WIQL), but the challenge with a linked query is that it returns only the SourceId and TargetId, along with the link type ID. It does not return other important fields like title, state, and description.

To display those fields, we would have to:
a. Get all Source and Target IDs (linked work item IDs) from the linked query into a data structure.
b. Iterate over each ID and make an API call to get the detailed info for each item.
This involves too many calls to the TFS server.

For example, to query bugs that have user stories, we need to:
a. Execute a linked query to get all Source and Target IDs (linked work item IDs).
b. Execute a flat query to get the bug details (title, status, priority, etc.) for each Target ID.
If the query returns 10,000 bugs, we would have to make 10,000 calls to TFS to get the bug details for each Target ID.

Solution:
We can run one single flat query, passing all Target IDs, and get the work item details in one go.

Below is a code snippet that retrieves work item details using a linked query (i.e., the bug details of bugs that have user stories).

// Assumes _store is an already-connected WorkItemStore instance.
// Linked query returning Source (bug id) and Target (linked user story id) pairs for project "My Project"
Query query = new Query(_store,
    "SELECT [System.Id] FROM WorkItemLinks " +
    "WHERE ([Source].[System.TeamProject] = 'My Project' AND [Source].[System.WorkItemType] = 'Bug') " +
    "AND ([System.Links.LinkType] <> '') AND ([Target].[System.WorkItemType] = 'User Story') " +
    "ORDER BY [System.Id] mode(MustContain)");

// Run the link query once to get the SourceId/TargetId pairs
WorkItemLinkInfo[] wlinks = query.RunLinkQuery();

// Get the list of work item ids for which we want more detailed information
int[] ids = (from WorkItemLinkInfo info in wlinks
             select info.TargetId).ToArray();

// Use one flat query to get the detailed work item info for the whole list of ids in a single round trip
string detailsWiql = "SELECT [System.Id], [System.Title], [System.State] FROM WorkItems";
Query detailedInfoQuery = new Query(_store, detailsWiql, ids);
WorkItemCollection workitems = detailedInfoQuery.RunQuery();

foreach (WorkItem wi in workitems)
{
    string workItemType = wi.Type.Name;
    int id = wi.Id;
    string bugTitle = wi.Title;
}



Arabic Language Pack for SCSM Self Service Portal


Hi All,

One of the challenges we face in our region is providing users with the Self Service Portal in their native language. Since Arabic is not among the built-in languages shipped with the Service Manager Self Service Portal, we were looking into different options, such as a 3rd-party portal, but not now 🙂

We spent some time looking into the files the SSP uses and located the language resource files, which can be used not only for Arabic but for any other language that is not available in the SCSM Self Service Portal.

In this post we will show you two things. First, how to filter the languages and keep only the required ones instead of having all languages available in the portal. Second, how to configure an Arabic language pack for the System Center Service Manager Self Service Portal.

First: Show preferred languages (remove unnecessary ones)

When you click the language settings (top right corner) in the Self Service Portal, by default ten or more languages appear, including Chinese, French, Japanese, etc. To make it easier for users, it is preferable to show them only the languages they could actually use. Follow the procedure below to make that happen:

0- <<BACKUP BACKUP BACKUP>>

1- Browse to (C:\inetpub\wwwroot\SelfServicePortal\Views\Shared) folder

2- Edit the (_Layout.cshtml) file using Notepad or any other tool, run as administrator. (Don't forget to back up the file and save it somewhere else before editing it.)

3- Search the file for "<ul class=lang_menu ..."

4- Remove the lines for the unnecessary languages and keep the ones you want your users to see. Remember to remove each whole line (from <li ------- to -------- </li>).

I removed all languages except English, French, and Dutch.

5- Refresh your portal ...

Completed! Now let's see how we can configure a new language pack 🙂

 


 

Second: Configure an Arabic language pack for the SSP

As mentioned before, this is not limited to Arabic; you can use the same approach to configure any language you want, but in this example we will configure an Arabic language pack. Follow the procedure below:

1- Browse to (C:\inetpub\wwwroot\SelfServicePortal\Views\Shared) folder

2- Edit the (_Layout.cshtml) file using Notepad or any other tool, run as administrator. (Don't forget to back up the file before editing it.)

3- Add the following line inside the <ul class="lang_menu" ...> element:

<li value="ar-JO" tabindex="12">Arabic</li>

Note: ar-JO is the Arabic language code for Jordan. For more info about language codes for different countries, see https://www.andiamo.co.uk/resources/iso-language-codes

 

4- Browse to folder (C:\inetpub\wwwroot\SelfServicePortal\App_GlobalResources)

5- Copy the file (SelfServicePortalResources.en.resx) to your local machine (where an Arabic keyboard is supported).

6- Rename the file to (SelfServicePortalResources.ar.resx).

7- Edit the file using any tool (such as Notepad++).

8- In the file you will find all the words used in the portal. Translate them into Arabic, or download this translated file: SelfServicePortalResources.ar_

 

 

9- Upload the file to the (C:\inetpub\wwwroot\SelfServicePortal\App_GlobalResources) folder.

 

10- Refresh your browser and select Arabic from the Language Settings tab.

 

NOTE: if you don't have any Service Offering with the (Arabic) language selected, you won't see any offerings. Create at least one service offering with its language set to Arabic, then add some request offerings to it.

Hope this would be useful ... Thanks for reading

Mohamad Damati

Intune/EMS enrollments (ADFS scenario)


Many customers are facing problems with Intune enrollment on Android devices; the cause can be:

  1. A missing certificate: ensure that the full certificate chain is installed on the ADFS proxies/servers (check it here: https://www.ssllabs.com/ssltest)
  2. The authentication doesn't work when enrolling in the Company Portal:
    • Check the TLS version on the ADFS proxy or your HLB
    • Check the cipher suites on your HLB

Then it will work.

Don't forget that some devices are not compatible with Intune (as of 09.2018).

Resolving WSUS Performance Issues


Introduction

I have recently come across multiple customers having issues with high IIS worker process CPU usage, causing their servers to flatline, so I wanted to take some time here to run you through the steps you can follow to remediate this issue.

 

 

So what is the issue exactly?

Even though it may seem daunting trying to figure out what is causing the headaches, the issue and its solutions are quite simple.

It is important to understand that it is usually a combination of two things:

  1. Regular WSUS maintenance is not being done, in particular the declining of superseded updates
  2. Incorrect configuration on the SCCM server hosting the SUP role (we will get into that a bit below)

Important Considerations

It is important to understand that even though the server appears to be flatlining on CPU, this is not a CPU issue. Adding more cores/processors will not resolve the issue you are experiencing. We need to delve deeper to resolve the underlying problem.

 

1. Regular WSUS maintenance not being done

This is a large part of the issue that customers are experiencing, so it is important to understand what we mean by this.

(Have a read through this amazing blog from Meghan Stewart, Support Escalation Engineer, to help you set up your WSUS maintenance. It has all the required information on when, why, and how to implement your WSUS maintenance, and includes a great PowerShell script to help.)

 

Superseded updates – updates that have been replaced by another, newer update and are no longer relevant. Yes, these updates still report that they are "needed"; however, they have been replaced by a newer update and are just taking up space and CPU time.

Example:

Let us take the example of a small network of 1,000 clients, running multiple versions of server and client OS (Windows 7, Windows 8.1, Windows 10, Server 2012 R2, Server 2016), with one or more SUPs.

Each client regularly scans against the SUP (Software Update Point) catalog to determine what updates are available, how compliant it is, and whether any updates are needed.

When clients scan against WSUS, they scan all updates that are not declined or obsolete. If, for instance, 25% of your updates are superseded, that is 25% wasted CPU time on each client machine as well as on the server it scans against.

So you need to ensure that you are regularly cleaning up the WSUS server as per the article above.
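If you want to script that cleanup rather than rely on the console wizard, the sketch below declines superseded updates using the UpdateServices PowerShell module. This is a minimal example under assumptions (default HTTP port 8530, run on the WSUS server itself); test it in a lab first:

# Decline all superseded updates that are not already declined
# Assumes the default HTTP port 8530; adjust -PortNumber (and add -UseSsl) for your setup
$wsus = Get-WsusServer -Name 'localhost' -PortNumber 8530
$superseded = $wsus.GetUpdates() | Where-Object { $_.IsSuperseded -and -not $_.IsDeclined }
foreach ($update in $superseded) {
    Write-Output "Declining: $($update.Title)"
    $update.Decline()
}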

 

2. Incorrect configuration on the SCCM server hosting the SUP role

An important step missed by a lot of customers is configuring the WSUS AppPool correctly.

 

The AppPool memory limit is, in a lot of cases, left at the default of 1.9 GB. This is not sufficient if you are managing a large number of clients, and it will need to be increased.

Note: this is reserved memory that you are allocating, so ensure that you have catered for it in your planning.

Open IIS Manager - expand the server name - Application Pools.

Right-click WsusPool - Advanced Settings.

The first thing you can do is change the Queue Length from 1000 to 2000 (environment depending; the queue length is the maximum number of HTTP.sys requests that will queue for the app pool before the system returns a 503 Service Unavailable error).

Secondly, the Private Memory limit needs to be changed to a minimum of 4 GB instead of the 1.9 GB default.

Once completed, recycle the AppPool.
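Both settings can also be applied from PowerShell instead of clicking through IIS Manager. A minimal sketch using the WebAdministration module (note that privateMemory is specified in KB, so 4 GB is 4194304):

Import-Module WebAdministration

# Queue Length: maximum HTTP.sys requests queued before 503s are returned
Set-ItemProperty -Path 'IIS:\AppPools\WsusPool' -Name queueLength -Value 2000

# Private Memory limit in KB (4 GB)
Set-ItemProperty -Path 'IIS:\AppPools\WsusPool' -Name recycling.periodicRestart.privateMemory -Value 4194304

# Recycle the pool so the new limits take effect
Restart-WebAppPool -Name 'WsusPool'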

My server may be flatlining so badly that I cannot open WSUS or complete the WSUS cleanup, so what now?

The last step you can take, in an extreme situation, is to temporarily kick the clients off the WSUS server so that you can complete the modifications to the WsusPool and perform the WSUS cleanup.

Temporarily kick the clients

We are going to create a new AppPool and change the website bindings so that we can access WSUS in order to perform the cleanup.

Note: During this step, your clients will not be able to connect to your WSUS instance.

Open IIS Manager - expand the server name - Application Pools.

Right-click Application Pools - Add Application Pool.

Once you have created the AppPool, change the website over to the new pool first.

 

The next step is to change the bindings and assign a different port number to the HTTP connection for WSUS, so that the clients are unable to scan against it, thereby freeing up memory for us.

Under IIS Manager - expand the server name - Sites - WSUS Administration.

Right-click - Edit Bindings.

Now assign a different port number (e.g. 1234).

Once this is done, you will need to restart the website.

While still in IIS Manager - expand the server name - Sites - WSUS Administration - Restart Website.

Now, when you connect to WSUS, specify the new custom port.

That should now allow you to run the cleanup and re-indexing of the WSUS DB.

Once you have completed this, make sure to change the bindings/pool back to what they were before, so that the clients can start scanning again.

 

Conclusion

As long as the correct configuration is applied in the environment and regular maintenance is in place, you should not have any further WSUS performance issues.

Creating symbolic links with PowerShell DSC


Background

In an Azure Windows VM you automatically get a temporary disk mapped to D: (on Linux it's mapped to /dev/sdb1). It is temporary because the storage is assigned from the local storage on the physical host. So if your VM is redeployed (due to host updates, host failures, resizing, etc.), it is recreated on a new host and assigned a new temporary drive there. The data on the temporary drive is not migrated, but the OS disk is, of course, preserved from the VHD in your storage account or managed disk.

The problem

In this specific scenario, the customer had a 3rd-party legacy application that reads from and writes to two directories on the D:\ drive. The directory paths were hard-coded in the application, and the directories were a couple of gigabytes in size, so copying them to the temporary drive each time the VMs were deployed would be time- and resource-consuming.

Choosing a solution

After thorough testing, of course, we decided to create two symbolic links from the D:\ drive to the real directories on the OS disk (where the directories were already present as part of the image). The symbolic-link creation can be accomplished with either the mklink command or the New-Item cmdlet in PowerShell 5.x.

Of course there are other methods of overcoming this challenge, such as switching drive letters with a data disk and moving the page file to the other drive letter. But we decided that the symbolic-links approach would be faster and wouldn't require an additional data disk and, with it, additional costs.

The implementation

Since the creation of the symbolic links needs to happen every time the VM is created (and redeployed), we ended up adding a PowerShell DSC extension to the VM in the ARM template. And since there were no built-in DSC resources in the OS, nor in the DSC Resource Kit in the PowerShell Gallery, that configure symbolic links, we wrote a (quick-and-dirty) PowerShell module and resource to create them.

Creating the module structure and the psm1 and schema.mof files is pretty easy when you're using the cmdlets from the xDSCResourceDesigner module:

Install-Module -Name xDSCResourceDesigner

$ModuleName = 'myModule'
$ResourceName = 'SymbolicLink'
$ModuleFolder = "C:\Program Files\WindowsPowerShell\Modules\$ModuleName"

New-xDscResource -Name $ResourceName -Property @(
    New-xDscResourceProperty -Name Path -Type String -Attribute Key
    New-xDscResourceProperty -Name TargetPath -Type String -Attribute Write
) -Path $ModuleFolder

cd $ModuleFolder
New-ModuleManifest -Path ".\$ModuleName.psd1"

The .psm1 resource file, C:\Program Files\WindowsPowerShell\Modules\myModule\DSCResources\SymbolicLink\SymbolicLink.psm1, should contain the three *-TargetResource functions (Get, Set and Test):

function Get-TargetResource {
    [CmdletBinding()]
    [OutputType([System.Collections.Hashtable])]
    param (
        [parameter(Mandatory = $true)]
        [System.String]
        $Path
    )

    Write-Verbose "Getting SymbolicLink for $Path"

    $Root = Split-Path -Path $Path -Parent
    $LinkName = Split-Path -Path $Path -Leaf
    $TargetPath = $null

    $link = Get-Item -Path (Join-Path -Path $Root -ChildPath $LinkName) -ErrorAction SilentlyContinue
    if($link -and  $link.LinkType -eq 'SymbolicLink') { $TargetPath = $link.Target[0] }
    
    @{Path = $Path; TargetPath = $TargetPath}
}


function Set-TargetResource {
    [CmdletBinding()]
    param
    (
        [parameter(Mandatory = $true)]
        [System.String]
        $Path,

        [System.String]
        $TargetPath
    )

    Write-Verbose "Creating a SymbolicLink from $Path to $TargetPath"

    $Root = Split-Path -Path $Path -Parent
    $LinkName = Split-Path -Path $Path -Leaf
    Set-Location -Path $Root
    New-Item -ItemType SymbolicLink -Name $LinkName -Target $TargetPath | Out-Null
}


function Test-TargetResource {
    [CmdletBinding()]
    [OutputType([System.Boolean])]
    param (
        [parameter(Mandatory = $true)]
        [System.String]
        $Path,

        [System.String]
        $TargetPath
    )

    Write-Verbose "Testing SymbolicLink for $Path"

    $current = Get-TargetResource -Path $Path
    return (($current.Path -eq $Path) -and ($current.TargetPath -eq $TargetPath))
}

Export-ModuleMember -Function *-TargetResource

And in the configuration document, remember to import the DSC resources from the module:

configuration Main {

    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName myModule

    node localhost {

        SymbolicLink 'INPUT_DIR' {
            Path       = 'D:\INPUT_DIR'
            TargetPath = 'C:\PathTo\myLegacyApp\INPUT_DIR'
        }
        
        SymbolicLink 'OUTPUT_DIR' {
            Path       = 'D:\OUTPUT_DIR'
            TargetPath = 'C:\PathTo\myLegacyApp\OUTPUT_DIR'
        }
    }
}

Now, to create the zip file containing the configuration document and all required modules:

# Create the zip package
Publish-AzureRmVMDscConfiguration .\myDSC.ps1 -OutputArchivePath .\myDSC.zip

And upload it to the blob container (used in the ARM template):

# Variables
$storageAccountName = 'statweb'
$resourceGroupName = 'rg-statweb'

# Login to Azure
Login-AzureRmAccount

# Get the Storage Account authentication key
$keys = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName

# Create a Storage Authentication Context
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $keys.Item(0).value

# Upload the file to the blob container
Set-AzureStorageBlobContent -Context $context -Container dsc -File .\myDSC.zip -Blob myDSC.zip

Conclusion

There are usually several methods to accomplish a single task, and you should take all aspects and constraints into consideration, because one method can be more effective than another.

And if you don't already feel comfortable scripting with PowerShell, you should hurry and Start-Learning. There are a ton of excellent resources out there, but if you prefer a face-to-face, in-class learning experience and have a Premier contract, contact your Technical Account Manager (TAM) for more information on our PowerShell workshop series.



HTH,

Martin.

Unable to start SCOM ACS collector service – Event ID 4661


Problem Description and Symptoms:

The Operations Manager Audit Collection Service does not start, giving the following error and event ID:

Event ID 4661 Error :
AdtServer encountered the following problem during startup:
Task: Load Certificate
Failure: Certificate for SSL based authentication could not be loaded
Error:
0x80092004
Error Message:
Cannot find object or property.


Solution:

1. Ensure that the certificate exists on the management server acting as the ACS collector and that it is valid. (If not, issue one for the collector and import it into the Local Computer –> Personal –> Certificates store.)

2. Open CMD as Administrator

3. Go to the following path: %systemroot%\system32\Security\AdtServer

4. Execute the following: adtserver.exe -c and choose the certificate to be used (this command binds the certificate to the service).

5. Start the Audit Collection service by executing: net start adtserver

6. Check the collector health.

In which scenarios are certificates needed, and why?

ACS requires mutual authentication between the forwarder(s) and the collector(s), prior to the exchange of information, to ensure the authentication process between the two is secure. When the forwarder and the collector reside in the same Active Directory domain, or in Active Directory domains that have established trust relationships, they will use the Kerberos authentication mechanisms provided by Active Directory.

But when the forwarder and collector are in different domains with no trust relationship, other mechanisms must be used to satisfy the mutual-authentication requirement in a secure way. This is where certificates come in: they ensure that authentication between the two parties (forwarder and collector) can take place, so they can start exchanging information.

Monitoring Application Deployment Failures in Configuration Manager


Background

One of the key features of System Center Configuration Manager is application deployment. Most of our enterprise customers have invested heavily in administrative time and skills to manage the deployment of applications to thousands of machines within their environments.

 

The Scenario

With numerous applications deployed to collections, my enterprise customer found it difficult to track application failures across their environment. The issue they encountered was that most reports they tried could only report against the application deployment creation date; this is a limitation when applications were created and deployed months ago but remain active in a large environment.

 

The Solution

After much deliberation, we concluded that the customer needed the ability to report on machines that failed their application deployment in the last week (or a timeline they specify), regardless of when the application deployment was initially created.

With their objectives in mind, we put together the solution below for use in their environment, at their discretion of course:
•    Use a set of application deployment reports to support the administrators in monitoring failed application deployments.
•    Provide a means to remove direct membership of machines in the collections targeted by application deployments on a weekly basis.

 

The Implementation

 

1.    App Portal - Application deployment failure report

•    Administrators initiate their monitoring/troubleshooting of failed application deployments for their environment by reviewing this report.
•    By default, the report is configured to provide a list of machines that last attempted to install an application in the past 7 days. The timeline can be modified with the report parameters.
•    This report provides an overall view of application failures; most importantly, it provides:
1-    information on when a machine last attempted the installation of an application
2-    the error code of the failed deployment
•    Top 5 Application Failures: administrators can easily identify the top 5 applications that failed deployment and focus their remediation efforts.
•    Count of error codes: administrators can easily identify the top 5 errors across applications.
•    List of Failed Applications: administrators can identify the machine name, collection name, error code and, most importantly, the last time a machine attempted to install an application, in the Enforcement Time column.


 

2.    App portal – Application deployment status

•    This is a widely available report that provides overall information on application deployment status. The limitation of this report, however, is that it only provides information for application deployments created within the date range specified in the input parameters.

 

3.    Remove machine direct membership from collections (weekly/monthly)

The customer required a method to remove machines with direct membership from multiple collections targeted with applications, and decided to incorporate a PowerShell script to achieve this on a weekly/monthly basis.

The best approach for the customer was to populate an input text file in which they can manage the names of the Application Portal collections they chose to target. The PowerShell script is scheduled with Task Scheduler.

 

•    PowerShell Script to remove direct membership from collection:

#############################################################################
#
# Purpose : This script removes collection direct membership from a list
#
#############################################################################

# Load Configuration Manager PowerShell Module
Import-Module ($Env:SMS_ADMIN_UI_PATH.Substring(0, $Env:SMS_ADMIN_UI_PATH.Length - 5) + '\ConfigurationManager.psd1')

# Get SiteCode and switch to the site drive
$SiteCode = Get-PSDrive -PSProvider CMSITE
Set-Location $SiteCode":"

# Define input file (expected next to the script)
$script_parent = Split-Path -Parent $MyInvocation.MyCommand.Definition
$inputfile = $script_parent + "\InputFile.txt"
$list = Get-Content $inputfile

# Define logfile parameters
# Logfile time stamp
$LogTime = Get-Date -Format "dd-MM-yyyy_hh-mm-ss"
# Logfile name
$Logfile = $script_parent + "\CMRemDirectMem_" + $LogTime + ".log"
Function LogWrite
{
    Param ([string]$logstring)
    Add-Content $Logfile -Value $logstring
}

# Remove the direct membership rules for each collection in the input file
ForEach ($CollectionName In $list)
{
    Remove-CMDeviceCollectionDirectMembershipRule -CollectionName $CollectionName -ResourceName * -Force -ErrorAction SilentlyContinue
    LogWrite "$LogTime | $CollectionName"
    Write-Output "$LogTime | $CollectionName"
}

•    The PowerShell script uses an InputFile.txt to specify the application deployment collections.
•    The InputFile.txt needs to be created in the same folder as the PowerShell script, as the script references its parent folder.
•    An output logfile is created as the PowerShell script executes.


•    Example of the Task Scheduler configuration created for the basic task's Action tab:


Program/script:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Add Arguments (optional):
"D:\SOURCES\PowerShell\Collections_RemoveDirectMembership\Collections_RemoveDirectMembership.ps1"

 

4.    App portal – List of collections modified

Once the PowerShell script has executed, the report below is scheduled to run. It records the list of collections that were modified by the PowerShell script. I have included the Date input parameters with a default offset of 1 day.

 

5.    App portal – List of machines in collections with direct membership

This is the second report, also scheduled to run after the PowerShell script. It displays the list of machines that still have direct membership; if any appear, administrators can troubleshoot further.

 

The Conclusion

Monitoring application deployments is a time-consuming task for most organizations; creating a process that is easy to follow benefits administrators by allowing them to monitor and maintain large environments. I hope the information shared in the above scenario is helpful.

The reports can be downloaded below.

RDL

Note, I have split the reports into SSRS 2016 or later and SSRS 2014 or earlier. The SSRS 2014 reports do not contain any charts displayed in the screenshots above.

 

Disclaimer – All scripts and reports are provided ‘AS IS’
This sample script is not supported under any Microsoft standard support program or service. This sample script is provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of this sample script and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of this script be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use this sample script or documentation, even if Microsoft has been advised of the possibility of such damages.


Automating the clean-up of Configuration Manager Maintenance Collections


Background

Most organizations using System Center Configuration Manager implement collections configured for maintenance tasks. Administrators generally monitor these collections on a weekly/monthly schedule and in some instances are required to delete the machines within them, for example collections containing obsolete clients.

Scenario

My customer was looking for a method to streamline their weekly/monthly maintenance tasks, in which they manually deleted machines from multiple collections.

Solution

Using the System Center Configuration Manager PowerShell cmdlets, a script was created to:
•    Read an input text file that is populated with specific collection names.

•    Automatically delete all machines from the collections specified in the input text file.
Note, use this script with extreme caution, as machines are deleted from the SCCM database; always ensure the correct collection names are populated in the input text file. Test this script in your lab environment to ensure it works as desired.

•    Invoke a collection membership update once the machines are deleted.

•    Output a logfile that records the collection names and the machines that were deleted, respectively.

 

PowerShell Script:

#           This script performs the following actions:
#            - CAUTION, deletes machines from SCCM in specified collections
#            - Updates the collection membership
#            - Creates a Logfile with Date, Time, Collection Name and Machine Names
#
#           This script does NOT:
#            - Remove the collection rules\query

# Load Configuration Manager PowerShell Module
Import-Module ($Env:SMS_ADMIN_UI_PATH.Substring(0, $Env:SMS_ADMIN_UI_PATH.Length - 5) + '\ConfigurationManager.psd1')

# Get SiteCode and switch to the site drive
$SiteCode = Get-PSDrive -PSProvider CMSITE
Set-Location $SiteCode":"

# Define input file (expected next to the script)
$script_parent = Split-Path -Parent $MyInvocation.MyCommand.Definition
$inputfile = $script_parent + "\InputFile.txt"
$list = Get-Content $inputfile

# Define logfile parameters
# Logfile time stamp
$LogTime = Get-Date -Format "dd-MM-yyyy_hh-mm-ss"
# Logfile name
$Logfile = $script_parent + "\CMRemoveDevice_" + $LogTime + ".log"
Function LogWrite
{
    Param ([string]$logstring)
    Add-Content $Logfile -Value $logstring
}

# Remove the machines in each collection from SCCM
ForEach ($CollectionName In $list)
{
    # Record the names of the devices in the collection before deleting them
    $CMDevices = Get-CMDevice -CollectionName $CollectionName | Format-List -Property Name | Out-String
    Get-CMDevice -CollectionName $CollectionName | Remove-CMDevice -Force -ErrorAction SilentlyContinue
    Get-CMDeviceCollection -Name $CollectionName | Invoke-CMCollectionUpdate -ErrorAction SilentlyContinue
    LogWrite "$LogTime | $CollectionName | $CMDevices"
}

 

Expected Log File Output:

 

Conclusion

System Center Configuration Manager is a product with a broad offering of features. Administrators can at times be overwhelmed by operational activities and overlook the monitoring of maintenance collections or the performing of maintenance tasks. Automating tasks to run on a regular schedule can provide consistency in maintaining the health of your environment. The above script can be configured to run using Task Scheduler.

 

Hope the shared script is helpful in maintaining your environment.

 

Disclaimer – All scripts and reports are provided ‘AS IS’
This sample script is not supported under any Microsoft standard support program or service. This sample script is provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of this sample script and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of this script be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use this sample script or documentation, even if Microsoft has been advised of the possibility of such damages.

Publish Your Home Internet Connection IP Address Using Azure Blobs – Part 1


Scenario

Ever wanted to be able to access your home server while you're on the go, but you don't know its IP address? Or maybe it changed?

Background

I have a server at home that I use to host a number of virtual machines for labs or tests. One VM on that server is published on the internet through my home internet connection. Since I travel a lot, I sometimes find it challenging to connect to this VM, and hence to my home server, to work remotely.

On any usual day, I open Bing and just type "What is my IP", and Bing writes it back to me or gives me a number of websites that can tell me my public IP address. I record that IP somewhere and connect to it whenever I want to play with my home server. But from time to time the IP address changes for various reasons, and I need to check again what the new IP address is, or ask somebody at home to try to figure it out (and I spend a considerable amount of time attempting to explain how to do that, what an IP is, and what those numbers are 😃). So, for the sole purpose of saving myself all this hassle, I went looking for a free solution that can help me figure out my home connection's IP address while I'm not connected to my home network.

In my quest, I came up with a not-so-bad idea that allows me to easily find my home IP address; the only requirements for this solution are an Azure subscription and the Azure PowerShell module.

In this two-part blog, I will be discussing the details of this solution.

The Solution

The basic idea of the solution is to use Azure Blob Storage to write a file that contains my home connection's IP address from a VM or a PC running at home; since the file is written to a blob in my Azure subscription, I can access it from anywhere on the planet.

There are mainly two parts to this solution:

  1. Finding my home connection IP address
  2. Writing the IP address to a file and uploading it to an Azure Blob

In this blog post I will discuss the first part; in the next article I will discuss the second part.

Part 1 - Finding my IP Address

So how can I figure out my IP address in an automated way? Remember, I'm not at home, so something needs to find it for me, and it sure isn't my wife or kids.

Powershell to the rescue

While I was searching for how to figure out my IP address using PowerShell, I found a clever script written by Aman Dhally (http://www.amandhally.net/2012/10/04/powershell-script-to-get-static-ip-address-or-public-ip-address-of-internet-connection/).

The idea is simple: using PowerShell, I create a simple request to any of the public websites that can figure out my IP address, in this case http://checkip.dyndns.com, and parse the response.

Browsing manually to http://checkip.dyndns.org in your favorite browser gets you a simple page with your IP address written on it. All I need to do is grab this IP and write it to a file. Simple 😃

I made some modifications to Aman's original script; the one I'm using is as follows:

function GetMyIP(){
    $DDNSIPCheckerUrl = "http://checkip.dyndns.com"
    $WebClient = New-Object System.Net.Webclient
    $DDNSResponse = $WebClient.DownloadString($DDNSIPCheckerUrl)
    
    $Regex = [regex] "\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}"
    $RegExResult = $Regex.Match($DDNSResponse)

    If($RegExResult.Success){
        $IPAddress = $DDNSResponse.Substring($RegExResult.Index,$RegExResult.Length)
        return $IPAddress     
    }
    else{
        return 0
    }
}

To test this function, copy and paste the code above into PowerShell, then type GetMyIP and hit Enter.

So what I (more accurately, Aman) am doing is creating a WebClient object to send an HTTP GET request (as if I were the browser) and storing the reply in the PowerShell variable $DDNSResponse. Then I use a regular expression to search the web server's response for a string that looks like an IP address (xxx.xxx.xxx.xxx), extract that IP, and return it; if the response does not contain an IP address, the function returns 0.

Give it a try at home, or at work, or any place 😃
See you in part 2!

Disclaimer – All scripts and reports are provided ‘AS IS’

This sample script is not supported under any Microsoft standard support program or service. This sample script is provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of this sample script and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of this script be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use this sample script or documentation, even if Microsoft has been advised of the possibility of such damages.

Publish Your Home Internet Connection IP Address Using Azure Blobs- Part 2

This is part 2 of a hack to publish your Home Internet Connection IP Address to Azure. Please check part 1 of the article here
https://blogs.technet.microsoft.com/meamcs/2018/12/23/publish-your-home-internet-connection-ip-address-using-azure-blobs-part-1/

Scenario

Ever wanted to be able to access your home server while you're on the go, but you don't know its IP address? Or maybe it changed?

Background

I have a server at home which I use to host a number of virtual machines for labs and tests. I have a VM on that server that is published on the internet through my home internet connection. Since I travel a lot, I sometimes find it challenging to connect to this VM, and hence to my home server, to work remotely.

On any usual day, I open Bing and just type What is my IP, and Bing writes it back to me or gives me a number of websites that can tell me my public IP address. I record that IP somewhere and connect to it whenever I want to play with my home server. But from time to time the IP address changes for various reasons, and I need to check again what the new IP address is, or ask somebody at home to try to figure it out (and spend a considerable amount of time attempting to explain how to do that, what an IP is, and what all those numbers are 😃). So, for the sole purpose of saving myself all this hassle, I went looking for a free solution that can help me figure out my home connection IP address while I'm not connected to my home network.

In my quest, I managed to come up with an idea that is not so bad and allows me to easily find my home IP address. The only requirements for this solution are an Azure subscription and the Azure Powershell module.

In this two-part blog, I will be discussing the details of this solution.

The Solution

The basic idea of the solution is to utilize Azure Blob Storage: a VM or a PC running at home writes a file that contains my home connection IP address, and since the file is written to a Blob in my Azure subscription, I can access it from anywhere on the planet.

There are mainly two parts to this solution:

  1. Finding my home connection IP address
  2. Writing the IP address to a file and uploading it to an Azure Blob

In this blog post, I will be discussing the second part.

Part 2 - Writing the IP address to a file and uploading it to an Azure Blob

So now that I have the IP address of my home internet connection, it's time to save it to a file. This is easier than you might imagine. Remember, in the first part of this blog post we created a function that retrieves the IP address, which looks like the following

function GetMyIP(){
   some code…
}

Now it's time to utilize it as follow

$MyIP = GetMyIP

Then write the IP Address to a file

$MyIP | Out-File C:\Temp\MyIP.txt -Force   # Use the -Force switch to overwrite any existing file if necessary

So now that we have the file with the IP address, it's time to upload it to Azure. To achieve that, we need to perform a number of steps in preparation for uploading the file:

  1. Create a Blob container in a Storage Account
  2. Obtain the storage account access key
  3. Install the Azure Powershell module
  4. Write Blob into container

So if you already have an existing Storage Account (and in turn a Resource Group) ready for this, you can utilize it; otherwise, you'll need to create a new one. A quick guide on creating a storage account can be found here (https://docs.microsoft.com/en-us/azure/storage/common/storage-quickstart-create-account?tabs=portal)

Once the storage account is created, you'll need to create a Blob container to host the uploaded file. Simply log on to the Azure Portal, browse to the storage account that you created, then click on Blob in the overview page

Then click on +Container

Give it a name and make sure you choose Private under the Public access level (you don't want the public to know your IP address)

Now you have the container created successfully
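If you prefer scripting over the portal, the container can also be created with the same classic Azure Powershell cmdlets used later in this post; a small sketch, assuming the storage account name and access key from the surrounding steps:

$saContext = New-AzureStorageContext -StorageAccountName "myipsa" -StorageAccountKey "<access key1>"
# -Permission Off makes the container private, matching the portal setting above
New-AzureStorageContainer -Name "myip" -Permission Off -Context $saContext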

Now back to the myipsa storage account overview page, click on Access Keys

Once in the Access Keys page, copy the Key under Key1 and paste it somewhere (Notepad will be a good choice)

The third step is to install the Azure Powershell module on the VM or PC that will be scheduled to run this script.
The simplest way to achieve this is by executing the following from an elevated Powershell window

Install-Module Azure

You might get a prompt asking for your permission to install the NuGet provider, as follows

If you are okay with that (which you need to be), type Y for Yes and press Enter (or just press Enter without typing anything). You will then get another prompt asking for your confirmation to install the Azure Powershell module from a public repository, and the installation will start

Now back to our script from part 1, add the following lines to the previous script

$SAAccessKey = "<access key1>" #Paste key1 here
$SAName = "myipsa"
$ContainerName = "myip"
$LocalFile = "C:\Temp\MyIP.txt"

In the above lines we are just preparing things: saving the access key to a variable, along with the storage account name, the container name, and the path to the file on your VM or PC that we will upload to the Blob container.

Now it's time to utilize the function we created in part 1; use it to obtain the IP address into a variable

$MyIPAddr = GetMyIP
$MyIPAddr | Out-File $LocalFile -Force

A better one-liner approach is as follows

GetMyIP | Out-File $LocalFile -Force

Note that I'm using the -Force option as I don't want the script to ask me whether to overwrite an existing file or not; it should just go ahead and overwrite it

Now we create a storage context defining the path in the storage container and how to access it; this is also a one-liner

$saContext = New-AzureStorageContext -StorageAccountName $SAName -StorageAccountKey $SAAccessKey

And finally, to upload the created file which contains the IP address

Set-AzureStorageBlobContent -File $LocalFile -Container $ContainerName -Blob "MyIP.txt" -Context $saContext -Force

So now the whole script (in a simple form)

<# 
 .SYNOPSIS 
    Captures internet connection public IP address and write it to a file on Azure Blob. 
 .DESCRIPTION 
    The purpose of this script is to help anybody with detecting the public IP address of an internet connection and 
    write it to a local file, then upload that file into an Azure Blob storage account
    Based on script by Aman Dhally (http://www.amandhally.net/2012/10/04/powershell-script-to-get-static-ip-address-or-public-ip-address-of-internet-connection/)
 .NOTES 
    Author  : Muhammad Raslan 
    Requires: Azure PowerShell Module
 .LINK 
    WebClient Class
        https://docs.microsoft.com/en-us/dotnet/api/system.net.webclient?view=netframework-4.7.2
    Regular Expression Language
        https://docs.microsoft.com/en-us/dotnet/standard/base-types/regular-expression-language-quick-reference
    New-AzureStorageContext
        https://docs.microsoft.com/en-us/powershell/module/azure.storage/new-azurestoragecontext?view=azurermps-6.13.0
    Set-AzureStorageBlobContent
        https://docs.microsoft.com/en-us/powershell/module/azure.storage/set-azurestorageblobcontent?view=azurermps-6.13.0
#> 
 

function GetMyIP(){
    $DDNSIPCheckerUrl = "http://checkip.dyndns.com"
    $WebClient = New-Object System.Net.Webclient
    $DDNSResponse = $WebClient.DownloadString($DDNSIPCheckerUrl)
    
    $Regex = [regex] "\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}"
    $RegExResult = $Regex.Match($DDNSResponse)

    If($RegExResult.Success){
        $IPAddress = $DDNSResponse.Substring($RegExResult.Index,$RegExResult.Length)
        return $IPAddress     
    }
    else{
        return 0
    }
}

$SAAccessKey = "<replace with your access key>" #Write your Access Key here
$SAName = "myipsa" #Write your Storage Account name here
$ContainerName = "myip" #Write your container name here
$LocalFile = "C:\Temp\MyIP.txt" #Write the full path to the temporary text file here

GetMyIP | Out-File $LocalFile -Force

$saContext = New-AzureStorageContext -StorageAccountName $SAName -StorageAccountKey $SAAccessKey
Set-AzureStorageBlobContent -File $LocalFile -Container $ContainerName -Blob "MyIP.txt" -Context $saContext -Force
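To have this run unattended, the script can be registered as a scheduled task on the home VM or PC. A sketch using the ScheduledTasks cmdlets (available on Windows 8 / Server 2012 and later), where C:\Scripts\Publish-MyIP.ps1 is an assumed path where you saved the script above:

$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Publish-MyIP.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
# Registers a daily task that refreshes the published IP address
Register-ScheduledTask -TaskName 'PublishMyIP' -Action $action -Trigger $trigger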

One cool idea is writing to an Azure Table instead of writing to a Blob, or you might have a different idea on how to utilize this.

This idea can also be extended to several different scenarios. For example, you can gather information about servers in your on-premises environment on a periodic basis and write it into Azure for further analysis. You could also store log files, or generate configurations and upload/download them; the possibilities with Azure are nearly limitless.
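And when you are on the road and need the IP address back, downloading the blob is just the reverse of the upload; a sketch using the same classic cmdlets from any machine with the Azure Powershell module installed:

$saContext = New-AzureStorageContext -StorageAccountName "myipsa" -StorageAccountKey "<access key1>"
# Downloads MyIP.txt from the container and prints its content
Get-AzureStorageBlobContent -Blob "MyIP.txt" -Container "myip" -Destination "C:\Temp\MyIP.txt" -Context $saContext -Force
Get-Content "C:\Temp\MyIP.txt"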

 

Till the next blog...

Cheers 😃

 

Disclaimer – All scripts and reports are provided ‘AS IS’

This sample script is not supported under any Microsoft standard support program or service. This sample script is provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of this sample script and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of this script be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use this sample script or documentation, even if Microsoft has been advised of the possibility of such damages.

Step by step MIM PAM setup and evaluation Guide – Part 1


 

Background:

Privileged Access Management (PAM) is a relatively new feature of Microsoft Identity Manager 2016 and is becoming more and more popular. The intention of this blog series is to provide step-by-step instructions on how to deploy PAM the right way and how to evaluate its features.

In this series I will use Azure VMs to simulate a real-world environment. You can opt for a different setup (Hyper-V, VMware or even physical servers).

Scenario

Contoso is a company with an existing AD infrastructure: the forest prod.contoso.com running Windows 2012 R2 Domain Controllers at Windows 2012 R2 domain and forest functional level. They want to implement Just-in-Time (JIT) access control for users of their two critical applications, TestApp and ClaimApp. They also want to restrict access for their Domain Admins using JIT and MFA authentication.

After their research, they want to evaluate MIM PAM as a solution for those requirements.

Series:

  • Part 1 – Preparing test environment
  • Part 2 – PAM prerequisites
  • Part 3 – Installing PAM Server
  • Part 4 – Installing PAM Example portal
  • Part 5 – MFA configuration
  • Part 6 – Evaluation

Test Environment description:

To evaluate PAM, we need two AD forests: a production and a privileged forest. In my test environment I will set up only one Domain Controller per AD forest, but in production you should have at least two. In addition to the Domain Controllers, in the production forest we will need an Exchange server and a client machine; in the privileged forest we will need a MIM server and, optionally, an additional client machine.

My lab setup contains the following:

prod.contoso.com Forest

PROD-DC running Windows 2012 R2 - Standard DS1 v2 (1 vcpus, 3.5 GB memory)

(I've selected Windows 2012 R2 to simulate the most common situation in the field)

  • Active Directory
  • Certificate Services
  • ADFS services
  • MFA server

PROD-EX running Windows 2016 - Standard E2s v3 (2 vcpus, 16 GB memory)

  • Exchange 2016
  • Windows Authentication sample application
  • Claims Authentication sample application

PROD-CL running Windows 10 - Standard DS1 v2 (1 vcpus, 3.5 GB memory)

priv.contoso.com Forest

PRIV-DC running Windows 2016 - Standard DS1 v2 (1 vcpus, 3.5 GB memory)

(even though we could use Windows 2012 R2, Windows 2016 provides many advantages)

  • Active Directory

PRIV-PAM running Windows 2016 - Standard B8ms (8 vcpus, 32 GB memory)

  • SQL server 2016
  • SharePoint Server 2016
  • MIM Service and Portal
  • PAM

PRIV-CL running Windows 10 - Standard DS1 v2 (1 vcpus, 3.5 GB memory)

The PROD and PRIV forests are installed in different Azure Resource Groups, with routing configured between them.

Preparing Test Environment:

    1. Install OS on all machines (see above for OS version);
    2. Promote Domain Controllers:
      1. PROD

Log on to PROD-DC as an Administrator

In Admin PowerShell run following commands:

Install-WindowsFeature AD-Domain-Services -IncludeAllSubFeature -IncludeManagementTools

Import-Module ADDSDeployment

Install-ADDSForest -CreateDnsDelegation:$false -DatabasePath "C:\Windows\NTDS" -DomainMode "Win2012R2" -DomainName "prod.contoso.com" -DomainNetbiosName "PROD" -ForestMode "Win2012R2" -InstallDns:$true -LogPath "C:\Windows\NTDS" -NoRebootOnCompletion:$false -SysvolPath "C:\Windows\SYSVOL" -Force:$true

      2. PRIV

Log on to PRIV-DC as an Administrator

On PRIV-DC in Admin PowerShell run following commands:

Install-WindowsFeature AD-Domain-Services -IncludeAllSubFeature -IncludeManagementTools

Import-Module ADDSDeployment

Install-ADDSForest -CreateDnsDelegation:$false -DatabasePath "C:\Windows\NTDS" -DomainMode "WinThreshold" -DomainName "priv.contoso.com" -DomainNetbiosName "PRIV" -ForestMode "WinThreshold" -InstallDns:$true -LogPath "C:\Windows\NTDS" -NoRebootOnCompletion:$false -SysvolPath "C:\Windows\SYSVOL" -Force:$true

    3. Configure DNS

    Log on to PROD-DC as an Administrator

        1. On PROD-DC create the top DNS zone (contoso.com)

    Open Powershell as Admin and run the following command

    Add-DnsServerPrimaryZone -Name "contoso.com" -ReplicationScope "Forest" -PassThru -DynamicUpdate None

        2. Allow zone transfer to PRIV-DC:

    Set-DnsServerPrimaryZone -Name "contoso.com" -SecureSecondaries TransferToSecureServers -SecondaryServers "<priv-dc-ip>"

        3. Create A records in the new top zone for the applications

     

    Record                  IP                  Description
    sts.contoso.com         Prod DC IP          ADFS service
    claimapp.contoso.com    Prod Exchange IP    Application with claims-based authentication
    testapp.contoso.com     Prod Exchange IP    Application with Windows Integrated authentication
    mail.contoso.com        Prod Exchange IP    Mail server CAS
    mfasdk.contoso.com      Prod DC IP          MFA SDK service
    pam.contoso.com         Priv MIM IP         PAM user portal
    pamapi.contoso.com      Priv MIM IP         PAM API
    pamportal.contoso.com   Priv MIM IP         PAM administrative portal
    pamsvc.contoso.com      Priv MIM IP         PAM Web Service endpoint

     

    Add-DnsServerResourceRecordA -Name "sts" -ZoneName "contoso.com" -IPv4Address "<prod-dc-ip>"

    Add-DnsServerResourceRecordA -Name "claimapp" -ZoneName "contoso.com" -IPv4Address "<prod-ex-ip>"

    Add-DnsServerResourceRecordA -Name "testapp" -ZoneName "contoso.com" -IPv4Address "<prod-ex-ip>"

    Add-DnsServerResourceRecordA -Name "mail" -ZoneName "contoso.com" -IPv4Address "<prod-ex-ip>"

    Add-DnsServerResourceRecordA -Name "mfasdk" -ZoneName "contoso.com" -IPv4Address "<prod-dc-ip>"

    Add-DnsServerResourceRecordA -Name "pam" -ZoneName "contoso.com" -IPv4Address "<priv-mim-ip>"

    Add-DnsServerResourceRecordA -Name "pamapi" -ZoneName "contoso.com" -IPv4Address "<priv-mim-ip>"

    Add-DnsServerResourceRecordA -Name "pamportal" -ZoneName "contoso.com" -IPv4Address "<priv-mim-ip>"

    Add-DnsServerResourceRecordA -Name "pamsvc" -ZoneName "contoso.com" -IPv4Address "<priv-mim-ip>"

        4. Create the PROD DNS zone delegation on PROD-DC

    Add-DnsServerZoneDelegation -Name "contoso.com" -ChildZoneName "prod" -NameServer "prod-dc.contoso.com" -IPAddress "<prod-dc-ip>" -PassThru

        5. Create the PRIV DNS zone delegation on PROD-DC

    Add-DnsServerZoneDelegation -Name "contoso.com" -ChildZoneName "priv" -NameServer "priv-dc.contoso.com" -IPAddress "<priv-dc-ip>" -PassThru

    Log on to PRIV-DC as an Administrator

        6. On PRIV-DC create a secondary copy of the top zone:

    Add-DnsServerSecondaryZone -Name "contoso.com" -MasterServers "<prod-dc-ip>" -ZoneFile "contoso.com.dns" -PassThru
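    As an optional sanity check (this assumes the DnsClient module available on Windows 2012 and later), you can verify on PRIV-DC that the transferred zone resolves the records created above:

    Resolve-DnsName -Name "sts.contoso.com" -Server localhost
    Resolve-DnsName -Name "pamportal.contoso.com" -Server localhost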

    4. Join all member servers and clients to their domains
          1. Join PRIV-PAM and PRIV-CL To PRIV Domain
          2. Join PROD-EX and PROD-CL To PROD Domain
    5. In the PROD domain create the TestAppUsers global security group and the TestAppUser account, and add TestAppUser to the TestAppUsers group

      Log on to PROD-DC as an Administrator

      Open Powershell as Admin and run the following commands

          1. Execute the following PowerShell commands to create the OUs:

      New-ADOrganizationalUnit -Name Corp -Path "DC=prod,DC=contoso,DC=com" -ProtectedFromAccidentalDeletion $true

      New-ADOrganizationalUnit -Name Application -Path "OU=Corp,DC=prod,DC=contoso,DC=com" -ProtectedFromAccidentalDeletion $true

      New-ADOrganizationalUnit -Name Administration -Path "OU=Corp,DC=prod,DC=contoso,DC=com" -ProtectedFromAccidentalDeletion $true

      New-ADOrganizationalUnit -Name SystemAccounts -Path "DC=prod,DC=contoso,DC=com" -ProtectedFromAccidentalDeletion $true

          2. Create the group:

      New-ADGroup -Name TestAppUsers -GroupCategory Security -GroupScope Global -Path "OU=Application,OU=Corp,DC=prod,DC=contoso,DC=com"

          3. Create the user:

      $secPwd = ConvertTo-SecureString 'P@$$w0rd' -AsPlainText -Force

      New-ADUser -Name TestAppUser -DisplayName "Test Application User" -Enabled $true -Path "OU=Application,OU=Corp,DC=prod,DC=contoso,DC=com" -SamAccountName TestAppUser -AccountPassword $secPwd

          4. Add the user to the group

      Add-ADGroupMember -Identity "CN=TestAppUsers,OU=Application,OU=Corp,DC=prod,DC=contoso,DC=com" -Members TestAppUser

    6. On the PROD-DC server install Certificate Services
        1. From PowerShell as an Admin execute the following commands:

        Add-WindowsFeature ADCS-Cert-Authority -IncludeManagementTools

        Install-AdcsCertificationAuthority -CAType EnterpriseRootCA -Force

        2. On the PROD server configure an HTTP CRL distribution point:
          1. Install and configure IIS for CRL distribution

          From PowerShell run

          Add-WindowsFeature Web-WebServer -IncludeManagementTools

          New-WebVirtualDirectory -Site "Default Web Site" -Name CertEnroll -PhysicalPath C:\Windows\System32\CertSrv\CertEnroll

        3. Configure and publish the Certificate Authority templates

          This operation requires much more complex scripting, so we will do it using the GUI

          Open the Certificate Authority tool, expand the CA server, right-click the Certificate Templates container and select "Manage"


          Find and select "Code Signing" template, right click and select "Duplicate Template"


          In "Properties of New Template" window select General Tab and enter Template Display Name "Code Signing V2"


          On the Tab "Superseded Templates" add old "Code Signing" template


          Click "OK"

          Select and right-click the "Web Server" template, then select "Duplicate Template"

          In "Properties of New Template" window select General Tab and enter Template Display Name "Web Server V2"


          On the "Request Handling" tab check "Allow private key to be exported"


          On the Tab "Superseded Templates" add old "Web Server" template


          On the "Security" Tab add Read and Enroll permissions to "Domain Computers" and "Domain Controllers" groups.


          Click "OK"

          Close Certificate Templates Console.

          In Certificate Authority tool select "Certificate Templates" container

          Right click on the empty space in the central pane and select "Certificate Template to issue"


          Select (using Ctrl key) both new templates (Web Server V2 and Code Signing V2) and press OK


          Close Certificate Authority Tool

        4. Add the CA certificate to "Trusted Root Certification Authorities" in the PRIV forest
                1. Export CA root certificate

          At a command prompt on PROD-DC, change to the folder where you want to save the certificate and run the following command

          certutil -ca.cert PROD-DC.cer

          2. Distribute the PROD CA root certificate to client machines using GPO

          Log on to PRIV-DC as an Administrator

          On PRIV-DC open the Group Policy Management Console, find the Default Domain Policy and open it for editing

          Open Computer Configuration\Policies\Windows Settings\Security Settings\Public Key Policies, right-click Trusted Root Certification Authorities, and then click Import. Select the file created in the previous step (PROD-DC.cer) and follow the wizard (accept all defaults).
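          As a scriptable alternative to the GPO import (a sketch, not part of the original walkthrough), certutil can publish the exported root certificate directly into the PRIV forest's AD configuration, from which domain members pick it up automatically. Run this on PRIV-DC as an Enterprise Administrator, assuming PROD-DC.cer was copied there:

          certutil -dspublish -f PROD-DC.cer RootCA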

    7. On PROD-EX install Exchange Server 2016.

I won't spend much time explaining how to install Exchange 2016, but since we are installing it on Windows Server 2016 you should use the latest binaries. I was using the binaries found at https://www.microsoft.com/en-us/download/details.aspx?id=57068

    8. Configure Exchange Server OWA and PowerShell for SSL

            Log on to PROD-EX as a PROD\Administrator

                1. Re-apply Group Policies

            gpupdate /force

        2. Request the certificate

      On PROD-EX run the following command from PowerShell (Admin):

            $sslCertificate = Get-Certificate -Template WebServerV2 -SubjectName "CN=Prod-Ex.prod.contoso.com, OU=Blog,O=Contoso,L=Dubai,S=Dubai,C=AE" -DnsName mail.contoso.com -CertStoreLocation cert:\LocalMachine\My

        3. Reconfigure SSL

      Run the following commands in admin PowerShell on the PROD-EX server:

            $cert = $sslCertificate.Certificate.Thumbprint

            $guid = [guid]::NewGuid().ToString("B")

            netsh http add sslcert hostnameport="mail.contoso.com:443" certhash=$cert certstorename=MY appid="$guid"

            New-WebBinding -name "Default Web Site" -Protocol https -HostHeader "mail.contoso.com" -Port 443 -SslFlags 1

    9. Install ADFS

              Log on to PROD-DC as an Administrator

                  1. Initialize gMSA

      On PROD-DC run the following PowerShell (Admin) commands

      Add-KdsRootKey -EffectiveTime ((Get-Date).AddHours(-10))

              New-ADGroup -Name 'Grp-gMSA' -GroupScope Global -Description 'This Group contains Principals allowed to retrieve Managed Password'

        2. Create the service account

      New-ADServiceAccount -Name adm-ADFSService -DNSHostName 'prod-dc.prod.contoso.com' -PrincipalsAllowedToRetrieveManagedPassword 'Grp-gMSA' -Description 'Account Running ADFS service'

        3. Install

      On PROD-DC run the following PowerShell (Admin) command

      Install-WindowsFeature ADFS-Federation -IncludeManagementTools

        4. Create the ADFS SSL certificate

      On PROD-DC run the following command from PowerShell (Admin):

              $adfsCert = Get-Certificate -Template WebServerV2 -SubjectName "CN=sts.contoso.com, OU=Blog,O=Contoso,L=Dubai,S=Dubai,C=AE" -CertStoreLocation cert:\LocalMachine\My

        5. Set up ADFS

      On PROD-DC run the following commands from PowerShell (Admin):

              $cert = $adfsCert.Certificate.Thumbprint

              Install-AdfsFarm -CertificateThumbprint $cert -FederationServiceName sts.contoso.com -FederationServiceDisplayName "Contoso Corporation" -GroupServiceAccountIdentifier PROD\adm-ADFSService$ -OverwriteConfiguration
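      As an optional check (not part of the original steps), the standard ADFS federation metadata endpoint should respond once the farm is up; this assumes sts.contoso.com resolves and the WebServerV2 certificate chain is trusted:

      Invoke-WebRequest -Uri "https://sts.contoso.com/federationmetadata/2007-06/federationmetadata.xml" -UseBasicParsing | Select-Object StatusCode
      # Expect StatusCode 200 when the farm is configured correctly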

    10. Install the sample applications
                    1. Download sample Windows Authentication Application from https://github.com/gmihelcic/TestWindowsAuthenticationApp/raw/master/TestWindowsAuthentication.zip
                    2. Create new folder C:\Applications\WindowsAuth on PROD-EX and extract downloaded ZIP to that folder
                    3. Download sample Claim Authentication Application from https://github.com/gmihelcic/TestWindowsAuthenticationApp/raw/master/ClaimApp.zip
                    4. Create new folder C:\Applications\ClaimsAuth on PROD-EX and extract downloaded ZIP to that folder

                Log on to PROD-EX as a PROD\Administrator

        5. Create the application pools:

                Import-Module WebAdministration

                New-Item -Path IIS:\AppPools\WindowsAuthApp

                $AppPool = Get-Item IIS:\AppPools\WindowsAuthApp

                $AppPool.processModel.identityType = "NetworkService"

                $AppPool | Set-Item

                New-Item -Path IIS:\AppPools\ClaimsAuthApp

                $AppPool = Get-Item IIS:\AppPools\ClaimsAuthApp

                $AppPool.processModel.identityType = "NetworkService"

                $AppPool | Set-Item

        6. Create the web applications

                $sslCert = Get-Certificate -Template WebServerV2 -SubjectName "CN=testapp.contoso.com, OU=Blog,O=Contoso,L=Dubai,S=Dubai,C=AE" -CertStoreLocation cert:\LocalMachine\My

                $cert = $sslCert.Certificate.Thumbprint

                $guid = [guid]::NewGuid().ToString("B")

                netsh http add sslcert hostnameport="testapp.contoso.com:443" certhash=$cert certstorename=MY appid="$guid"

                New-WebSite -Name "testapp.contoso.com" -Ssl -Port 443 -HostHeader "testapp.contoso.com" -PhysicalPath "C:\Applications\WindowsAuth" -ApplicationPool "WindowsAuthApp" -SslFlags 1

                $sslCert = Get-Certificate -Template WebServerV2 -SubjectName "CN=claimapp.contoso.com, OU=Blog,O=Contoso,L=Dubai,S=Dubai,C=AE" -CertStoreLocation cert:\LocalMachine\My

                $cert = $sslCert.Certificate.Thumbprint

                $guid = [guid]::NewGuid().ToString("B")

                netsh http add sslcert hostnameport="claimapp.contoso.com:443" certhash=$cert certstorename=MY appid="$guid"

                New-WebSite -Name "claimapp.contoso.com" -Ssl -Port 443 -HostHeader "claimapp.contoso.com" -PhysicalPath "C:\Applications\ClaimsAuth" -ApplicationPool "ClaimsAuthApp" -SslFlags 1

        7. Configure Windows Authentication for testapp.contoso.com:

                Set-WebConfigurationProperty -filter /system.WebServer/security/authentication/AnonymousAuthentication -name enabled -value false -location testapp.contoso.com

                Set-WebConfigurationProperty -filter /system.WebServer/security/authentication/windowsAuthentication -name enabled -value true -location testapp.contoso.com

                setspn -S http/testapp.contoso.com PROD-EX

                IISRESET

    11. Register the application in ADFS

                  Log on to PROD-DC as an Administrator

      Run the following PowerShell commands with administrative privileges:

                      1. Add ADFS Relying Party Trust for ClaimApp

                  Add-AdfsRelyingPartyTrust -Name "Test Application" -WSFedEndpoint 'https://claimapp.contoso.com' -Identifier 'https://claimapp.contoso.com' -Enabled $true

        2. Configure the Relying Party claim issuance rules

                  $rules = @'

                  @RuleName = "Roles"

                  c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/groupsid"]

                  => issue(Type = "http://schemas.microsoft.com/ws/2008/06/identity/claims/role", Issuer = c.Issuer, OriginalIssuer = c.OriginalIssuer, Value = c.Value, ValueType = c.ValueType);

                  @RuleName = "User Name"

                  c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY"]

                  => issue(store = "Active Directory", types = ("http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name"), query = ";userPrincipalName;{0}", param = c.Value);

                  '@

                  Set-ADFSRelyingPartyTrust -TargetName "Test Application" -IssuanceTransformRules $rules

        3. Configure the Relying Party authorization rule

                  $authRules = '=> issue(Type = "http://schemas.microsoft.com/authorization/claims/permit", Value = "true");'

                  $rSet = New-ADFSClaimRuleSet -ClaimRule $authRules

                  Set-ADFSRelyingPartyTrust -TargetName "Test Application" -IssuanceAuthorizationRules $rSet.ClaimRulesString

    12. Configure the Local Intranet zone
                        1. Open Group Policy Management tool
                        2. Expand Forest/Domains/prod.contoso.com/Group Policy Objects and right click on "Default Domain Policy". Select Edit

                    image

        3. Expand Computer Configuration/Policies/Administrative Templates/Windows Components/Internet Explorer/Internet Control Panel/Security Page and double-click the "Site To Zone Assignment List" policy

                    image

        4. Click Enabled and then the Show button

    Assign the following sites:

          • https://sts.contoso.com
          • https://testapp.contoso.com
          • https://claimapp.contoso.com
          • https://mfasdk.contoso.com
          • https://pamapi.contoso.com
          • https://pamportal.contoso.com
          • https://pam.contoso.com

    to the "Local Intranet" zone (value 1).


                    Click OK, OK and close GPMC tool

        5. Repeat steps 1-4 above for the priv.contoso.com domain.

                    Log on to PROD-CL as a PROD\Administrator

        1. Open an administrative Command Prompt and execute

                    Gpupdate /force

        2. Open the System tool from Control Panel and configure Domain Users for remote access


    13. Test the applications

                        Log on to PROD-CL as a PROD\TestAppUser with password P@$$w0rd

        1. Open Internet Explorer and go to https://testapp.contoso.com

      You should see the test application page load successfully.

        2. Open https://claimapp.contoso.com

      You should see the claims application page load successfully.
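      You can also verify the Windows-auth application from Powershell instead of the browser; a small sketch (Invoke-WebRequest sends the logged-on user's credentials with -UseDefaultCredentials). The claims app needs the ADFS redirect, so the browser test above remains the easiest way to check it:

      $resp = Invoke-WebRequest -Uri "https://testapp.contoso.com" -UseBasicParsing -UseDefaultCredentials
      $resp.StatusCode   # expect 200 when Windows Integrated Authentication succeeds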

                        Conclusion of Part 1

Now we are ready for Part 2 - PAM prerequisites setup.

In this exercise we set up an environment with several components. This environment will be a good basis for the next exercises.

In this exercise I didn't spend much time on PRIV forest hardening, which I leave to you for the future.

In Part 2 we will set up a bunch of accounts and groups, harden the PAM server, and set up SQL and SharePoint 2016 for PAM. Until then

                        Have a great week.

                         

                        Disclaimer – All scripts and reports are provided ‘AS IS’

                        This sample script is not supported under any Microsoft standard support program or service. This sample script is provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of this sample script and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of this script be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use this sample script or documentation, even if Microsoft has been advised of the possibility of such damages.

                        Step by step MIM PAM setup and evaluation Guide – Part 2


This is the second part of the series. In the previous post we prepared the test environment for PAM deployment. Now we have two forests, prod.contoso.com and priv.contoso.com. In PROD we have set up Certificate Services, an Exchange server and ADFS, and configured two test applications: one using Windows Integrated Authentication and the second claims-based authentication.

                        Series:

                        • Part 1 – Preparing test environment
                        • Part 2 – PAM prerequisites
                        • Part 3 – Installing PAM Server
                        • Part 4 – Installing PAM Example portal
                        • Part 5 – MFA configuration
                        • Part 6 – Evaluation

                        Installing PAM prerequisites

  1. Preparing the PROD forest

                        Log on to PROD-DC as an Administrator

    1. Create an OU for Exchange linked mailbox accounts

                        New-ADOrganizationalUnit -Name 'Linked accounts' -Path "DC=prod,DC=contoso,DC=com" -ProtectedFromAccidentalDeletion $true

    2. Create the auditing group

In PowerShell execute the following command

                        New-ADGroup -Name "PROD`$`$`$" -Path "OU=SystemAccounts,DC=prod,DC=contoso,DC=com" -GroupScope DomainLocal -GroupCategory Security -Description "Supports PAM Auditing"

    3. Configure audit policies

Open an admin Command Prompt and execute the following commands:

                        Auditpol.exe /Set /Category:"Account Management","DS Access" /Failure:Enable /Success:Enable

                        gpupdate /force /target:Computer

    4. Configure registry settings for SID History migration

New-ItemProperty -Path HKLM:SYSTEM\CurrentControlSet\Control\Lsa -Name TcpipClientSupport -PropertyType DWORD -Value 1

                        Restart-Computer

  2. Preparing the PRIV forest:

                          Log on to PRIV-DC as an Administrator

                              1. Create Organizational Units

                          New-ADOrganizationalUnit -Name 'Service Identities' -Path "DC=priv,DC=contoso,DC=com" -ProtectedFromAccidentalDeletion $true

                          New-ADOrganizationalUnit -Name 'Service accounts' -Path "OU=Service Identities,DC=priv,DC=contoso,DC=com" -ProtectedFromAccidentalDeletion $true

                          New-ADOrganizationalUnit -Name 'PAM Objects' -Path "DC=priv,DC=contoso,DC=com" -ProtectedFromAccidentalDeletion $true

                          New-ADOrganizationalUnit -Name 'Service groups' -Path "OU=Service Identities,DC=priv,DC=contoso,DC=com" -ProtectedFromAccidentalDeletion $true

      2. Prepare the domain for gMSA

Add-KdsRootKey -EffectiveTime ((Get-Date).AddHours(-10))

                          New-ADGroup -Name 'Grp-gMSA' -Path "OU=Service groups,OU=Service Identities,DC=priv,DC=contoso,DC=com" -GroupScope Global -Description 'This Group contains Principals allowed to retrieve Managed Password'

      3. Create accounts

  Username          Description
  PAMAdmin          Used to install and administer MIM. Needs log on locally and access over the network
  svc_PAMAppPool    Runs the SharePoint app pool for the PAM portal (needs log on as a batch job)
  svc_PAMFarmWSS    Runs the WSS farm
  svc_PAMWs         Runs the MIM Service. If you are running Exchange Server 2007 or later, give this account a mailbox; in any event make it mail-enabled
  svc_PAMMonitor    Runs the PAM Monitor service
  svc_PAMComponent  Runs the PAM Component service
  svc_MIMMA         Needed for MIM Portal installation (NOT IN USE)

                           

                          $svcAccounts = "OU=Service accounts,OU=Service Identities,DC=priv,DC=contoso,DC=com"

$secPwd = ConvertTo-SecureString 'P@$$w0rd' -AsPlainText -Force

                          New-ADUser -Name PAMAdmin -DisplayName "PAM Administrator" -Enabled $true -Path $svcAccounts -SamAccountName PAMAdmin -AccountPassword $secPwd -UserPrincipalName "PAMAdmin@priv.contoso.com" -Description "Used to install and Administer MIM. Needs Logon Locally and access over the network"

                          New-ADUser -Name svc_PAMAppPool -DisplayName "PAM AppPool" -Enabled $true -Path $svcAccounts -SamAccountName svc_PAMAppPool -AccountPassword $secPwd -UserPrincipalName "svc_PAMAppPool@priv.contoso.com" -Description "This Domain Account will run the SharePoint App Pool for PAM Portal (needs logon as a batch job)"

                          New-ADUser -Name svc_PAMFarmWSS -DisplayName "PAM FarmWSS" -Enabled $true -Path $svcAccounts -SamAccountName svc_PAMFarmWSS -AccountPassword $secPwd -UserPrincipalName "svc_PAMFarmWSS@priv.contoso.com" -Description "This Domain Account will run WSS farm."

                          New-ADUser -Name svc_PAMWs -DisplayName "PAM Service" -Enabled $true -Path $svcAccounts -SamAccountName svc_PAMWs -AccountPassword $secPwd -UserPrincipalName "svc_PAMWs@priv.contoso.com" -Description "This Domain Account runs MIM Service (Put this into MIMSyncAdmins and MIMSyncPasswordSet groups). If you are running Exchange Server 2007 or later give this account a mailbox. In any event make it mail enabled"

                          New-ADUser -Name svc_PAMMonitor -DisplayName "PAM Monitor" -Enabled $true -Path $svcAccounts -SamAccountName svc_PAMMonitor -AccountPassword $secPwd -UserPrincipalName "svc_PAMMonitor@priv.contoso.com" -Description "This Domain Account runs PAM Monitor Service"

                          New-ADUser -Name svc_PAMComponent -DisplayName "PAM Component" -Enabled $true -Path $svcAccounts -SamAccountName svc_PAMComponent -AccountPassword $secPwd -UserPrincipalName "svc_PAMComponent@priv.contoso.com" -Description "This Domain Account runs PAM Component Service"

New-ADUser -Name svc_MIMMA -DisplayName "MIM Management Agent" -Enabled $false -Path $svcAccounts -SamAccountName svc_MIMMA -AccountPassword $secPwd -UserPrincipalName "svc_MIMMA@priv.contoso.com" -Description "Needed for MIM Portal installation - NOT IN USE"
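A quick sanity check (optional, and assuming the ActiveDirectory module is already loaded in this session) that all seven accounts landed in the Service accounts OU:

Get-ADUser -Filter * -SearchBase $svcAccounts | Select-Object Name, Enabled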

      4. Add the PRIV-PAM computer account to the Grp-gMSA group so it can retrieve Group Managed Service Account passwords:

Add-ADGroupMember -Identity "CN=Grp-gMSA,OU=Service groups,OU=Service Identities,DC=priv,DC=contoso,DC=com" -Members PRIV-PAM$

      5. Create Group Managed Service Accounts for SQL

New-ADServiceAccount -Name svc_SQLService -DNSHostName 'priv-pam.priv.contoso.com' -PrincipalsAllowedToRetrieveManagedPassword 'Grp-gMSA' -Description 'This Domain Account runs SQL Service'

New-ADServiceAccount -Name svc_SQLAgent -DNSHostName 'priv-pam.priv.contoso.com' -PrincipalsAllowedToRetrieveManagedPassword 'Grp-gMSA' -Description 'This Domain Account runs SQL Agent'

      6. Add SPNs

On PRIV-DC in PowerShell execute the following commands:

                          setspn -S http/pamportal.contoso.com svc_PAMAppPool

                          setspn -S http/pamapi.contoso.com svc_PAMAppPool

                          setspn -S FIMService/pamsvc.contoso.com svc_PAMWs

                          setspn -S http/pamportal svc_PAMAppPool

                          setspn -S http/pamapi svc_PAMAppPool

                          setspn -S FIMService/pamsvc svc_PAMWs
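To confirm the registrations took (an optional check), list the SPNs on each account:

setspn -L svc_PAMAppPool
setspn -L svc_PAMWs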

      7. Set Kerberos constrained delegation

On PRIV-DC open Active Directory Users and Computers, in the menu select View/Advanced Features, go to the Service Identities/Service accounts OU and double-click the svc_PAMWs account

Select the "Delegation" tab, select "Trust this user for delegation to specified services only" and select "Use Kerberos only"

Select Add and in the new window select the same user (svc_PAMWs). Select FIMService, click OK, and OK again.


                          Double click svc_PAMAppPool account

Select the "Delegation" tab, select "Trust this user for delegation to specified services only" and select "Use Kerberos only"

Select Add and in the new window select the svc_PAMWs user.

Select FIMService, click OK, and OK again.
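The same delegation can be configured from Powershell instead of the GUI; a hedged sketch with the AD module, where msDS-AllowedToDelegateTo is the attribute that holds the target SPNs for Kerberos-only constrained delegation:

Set-ADUser svc_PAMWs -Add @{'msDS-AllowedToDelegateTo' = @('FIMService/pamsvc.contoso.com','FIMService/pamsvc')}
Set-ADUser svc_PAMAppPool -Add @{'msDS-AllowedToDelegateTo' = @('FIMService/pamsvc.contoso.com','FIMService/pamsvc')}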

      8. To be able to configure the MFA Server, we will temporarily establish a two-way trust between the PROD and PRIV domains.

                          netdom trust prod.contoso.com /domain:priv.contoso.com /usero:prod\Administrator /passwordo:"<Administrator Password>" /Twoway /ForestTRANsitive:Yes /add

      9. Configure the trust to enable SID History

                          netdom trust prod.contoso.com /domain:priv.contoso.com /EnableSIDHistory yes /usero:prod\Administrator /passwordo:"<Administrator Password>"

                          netdom trust prod.contoso.com /domain:priv.contoso.com /Quarantine no /usero:prod\Administrator /passwordo:"<Administrator Password>"

      10. Configure audit policies

On PRIV-DC open an admin Command Prompt and execute the following commands:

                          Auditpol.exe /Set /Category:"Account Management","DS Access" /Failure:Enable /Success:Enable

                          gpupdate /force /target:Computer

      11. Configure AD access rights for PAMAdmin

Log on to PRIV-DC as an Enterprise Administrator, open a command prompt (Admin) and execute the following commands:

                          dsacls "CN=AuthN Policies,CN=AuthN Policy Configuration,CN=Services,CN=Configuration,DC=priv,DC=contoso,DC=com" /g PAMAdmin:RPWPRCWD;;msDS-AuthNPolicy /i:s

                          dsacls "CN=AuthN Policies,CN=AuthN Policy Configuration,CN=Services,CN=Configuration,DC=PRIV,DC=contoso,DC=com" /g PAMAdmin:CCDC;msDS-AuthNPolicy

                          dsacls "CN=AuthN Silos,CN=AuthN Policy Configuration,CN=Services,CN=Configuration,DC=PRIV,DC=contoso,DC=com" /g PAMAdmin:RPWPRCWD;;msDS-AuthNPolicySilo /i:s

                          dsacls "CN=AuthN Silos,CN=AuthN Policy Configuration,CN=Services,CN=Configuration,DC=PRIV,DC=contoso,DC=com" /g PAMAdmin:CCDC;msDS-AuthNPolicySilo

      12. Create a mailbox for the svc_PAMWs account

                            Log on to PROD-EX as a Domain Administrator

        1. Open PowerShell and execute the following commands:

                            Add-PSSnapin Microsoft.Exchange.Management.PowerShell.SnapIn

                            New-Mailbox -Name "PAM Service" -LinkedDomainController "priv-dc.priv.contoso.com" -LinkedMasterAccount "svc_PAMWs@priv.contoso.com" -OrganizationalUnit 'Linked accounts' -UserPrincipalName svc_PAMWs@prod.contoso.com -LinkedCredential:(Get-Credential PRIV.contoso.com\administrator)

  3. Prepare for SQL Server installation

                                Log on to PRIV-PAM as a Domain Administrator

                                    1. Install Windows .Net 4.6 and 3.5

                                Install-WindowsFeature NET-Framework-45-Core

                                Install-WindowsFeature NET-Framework-Core -Source "<Windows OS Drive>\sources\sxs"

                                Install-WindowsFeature RSAT-AD-PowerShell

    2. Harden accounts

                                On the PRIV-PAM server open Server Manager and from Tools menu select “Local Security Policy”

                                Navigate to “Local Policies\User Right Assignment”

                                Add specified users to appropriate Policies:

                                      1. Access this computer from the network – PAMAdmin, svc_SQLService
                                      2. Adjust memory quotas for a process - svc_SQLService
                                      3. Allow log on locally – PAMAdmin
                                      4. Allow log on through Remote Desktop Services – PAMAdmin
                                      5. Bypass traverse checking - svc_SQLService
                                      6. Deny Log on as a batch job – PAMAdmin
                                      7. Deny Log on as a service – PAMAdmin
                                      8. Deny Log on Locally - svc_SQLService
                                      9. Deny Log on through Remote Desktop Service - svc_SQLService
                                      10. Log on as a Batch Job - svc_SQLService
                                      11. Log on as a service – svc_SQLService
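These user-rights assignments can also be scripted rather than clicked through; a sketch using secedit (the C:\Temp paths are assumed), which exports the local policy, lets you edit the [Privilege Rights] section (for example SeDenyInteractiveLogonRight or SeServiceLogonRight), and re-imports it:

secedit /export /cfg C:\Temp\UserRights.inf /areas USER_RIGHTS
# edit C:\Temp\UserRights.inf, then re-apply:
secedit /configure /db C:\Temp\UserRights.sdb /cfg C:\Temp\UserRights.inf /areas USER_RIGHTS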
    3. Restart the PRIV-PAM server

                                Restart-Computer

    4. Install the service accounts

                                Log on to PRIV-PAM as a Domain Administrator

On the PRIV-PAM server open PowerShell as an Admin and execute the following commands:

                                Install-AdServiceAccount svc_SQLService

                                Install-AdServiceAccount svc_SQLAgent

    5. Add the PAMAdmin account to the local Administrators group

                                $group = [ADSI]"WinNT://PRIV-PAM/Administrators,group"

$group.psbase.Invoke("Add",([ADSI]"WinNT://PRIV/PAMAdmin").path)

  4. Install SQL Server 2016

                                  Log on to PRIV-PAM as a priv\PAMAdmin

                                      1. Create Answer file

Open Notepad and paste the following

                                  [OPTIONS]

                                  ACTION="Install"

                                  SUPPRESSPRIVACYSTATEMENTNOTICE="True"

                                  IACCEPTSQLSERVERLICENSETERMS="True"

                                  ENU="True"

                                  QUIET="False"

                                  QUIETSIMPLE="True"

                                  FEATURES=SQLENGINE,FULLTEXT

                                  INSTANCENAME="PAM"

                                  INSTANCEID="PAM"

                                  SQLCOLLATION="SQL_LATIN1_General_CP1_CI_AS"

                                  ; Accounts

                                  SQLSVCACCOUNT="PRIV\svc_SQLService$"

                                  AGTSVCACCOUNT="PRIV\svc_SQLAgent$"

                                  SQLSYSADMINACCOUNTS="PRIV\PAMAdmin"

                                  INDICATEPROGRESS="1"

                                  AGTSVCSTARTUPTYPE="Automatic"

                                  SQLSVCSTARTUPTYPE="Automatic"

                                  SQLTEMPDBFILECOUNT="4"

                                  TCPENABLED="1"

                                  NPENABLED="1"

This will install SQL Server in Evaluation mode. You may want to add a SQL Server license key; just add this line at the end of the above (naturally, replace the Xes with your license key):

PID="XXXX-XXXXX-XXXX-XXXX"

      2. Save the answer file to disk as PAM.inf
      3. Open PowerShell and change to the folder where the answer file is saved
      4. Run the following command (replace the path to the SQL distribution with yours)

                                  C:\Setup\Software\SQL2016\setup.exe /ConfigurationFile=PAM.inf

                                  This will install SQL server on PRIV-PAM machine.

      5. Configure SQL Server networking:

From PowerShell run the following commands:

                                  $env:PSModulePath = $env:PSModulePath + ";C:\Program Files (x86)\Microsoft SQL Server\130\Tools\PowerShell\Modules"

                                  Import-Module SQLPS

$wmi = New-Object ('Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer')

                                  $Tcp = $wmi.GetSmoObject("ManagedComputer[@Name=`'PRIV-PAM`']/ServerInstance[@Name=`'PAM`']/ServerProtocol[@Name='Tcp']")

                                  $Tcp.IsEnabled = $true

                                  $wmi.GetSmoObject("ManagedComputer[@Name=`'PRIV-PAM`']/ServerInstance[@Name=`'PAM`']/ServerProtocol[@Name='Tcp']/IPAddress[@Name='IPAll']").IPAddressProperties['TcpPort'].Value='1433'

                                  $wmi.GetSmoObject("ManagedComputer[@Name=`'PRIV-PAM`']/ServerInstance[@Name=`'PAM`']/ServerProtocol[@Name='Tcp']/IPAddress[@Name='IPAll']").IPAddressProperties['TcpDynamicPorts'].Value=""

                                  $Tcp.Alter()

$np = $wmi.GetSmoObject("ManagedComputer[@Name=`'PRIV-PAM`']/ServerInstance[@Name=`'PAM`']/ServerProtocol[@Name='np']")

                                  $np.IsEnabled = $true

                                  $np.Alter()

      6. Restart the SQL Server service
      7. Create a firewall rule to allow access to the SQL service

                                  New-NetFirewallRule -Description 'Enables connection to SQL Server' -Enabled True -Name 'AllowSQL' -DisplayName 'Allow SQL' -Protocol Tcp -LocalAddress Any -LocalPort '1433' -RemoteAddress Any -RemotePort Any

      8. SQL Server Management Studio is no longer part of the SQL Server distribution and needs to be downloaded and installed separately. To download and install SQL Server Management Studio:
                                        1. Download SQL Management Studio from

                                  https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-2017

        2. From PowerShell run the following command (replace the path with yours)

                                  & 'C:\Setup\Software\Microsoft SQL Server Management Studio - 18.0 Preview 4\SSMS-Setup-ENU.exe' /install /passive

  5. Prepare the PAM server for installation
                                        1. Install Windows features

On the PRIV-PAM server from PowerShell (Admin) run the following command

Install-WindowsFeature Web-Common-Http, Web-Static-Content, Web-Default-Doc, Web-Dir-Browsing, `
    Web-Http-Errors, Web-Http-Redirect, Web-Asp-Net, Web-Net-Ext, Web-ISAPI-Filter, Web-Http-Logging, `
    Web-Request-Monitor, Web-Http-Tracing, Web-Stat-Compression, Web-Dyn-Compression, Web-Basic-Auth, `
    Web-Windows-Auth, Web-Digest-Auth, Web-Filtering, Web-Mgmt-Console, Web-Scripting-Tools, `
    Web-Mgmt-Compat, Net-Framework-Features, Web-Server, Web-WebServer, Web-App-Dev, Web-Health, `
    Web-Security, Web-Performance, Web-Mgmt-Tools, Web-Metabase, NET-HTTP-Activation, NET-Non-HTTP-Activ, `
    NET-WCF-Pipe-Activation45, NET-WCF-HTTP-Activation45, Web-Asp-Net45, Web-Net-Ext45

    2. Set local policies to harden accounts

                                    On the PRIV-PAM server open Server Manager and from Tools menu select “Local Security Policy”

                                    Navigate to “Local Policies\User Right Assignment”

                                    Add specified users to appropriate Policies

                                          1. Deny access to this computer from the network - svc_PAMMonitor, svc_PAMComponent
                                          2. Deny Log on as a batch job – svc_PAMMonitor, svc_PAMComponent, svc_PAMWs
                                          3. Deny Log on Locally - svc_PAMMonitor, svc_PAMComponent, svc_PAMWs
                                          4. Deny Log on through Remote Desktop Service - svc_PAMMonitor, svc_PAMComponent, svc_PAMWs
                                          5. Log on as a service – svc_PAMMonitor, svc_PAMComponent, svc_PAMWs
    3. Reapply policies

From PowerShell run the following command

                                    gpupdate /force /target:Computer

    4. Configure IIS

From PowerShell run the following commands

                                    iisreset /STOP

                                    C:\Windows\System32\inetsrv\appcmd.exe unlock config /section:windowsAuthentication -commit:apphost

                                    iisreset /START

    5. Create SQL aliases

Now we will create two SQL aliases, one for SharePoint and one for the PAM Service. Using SQL aliases is recommended because it makes a later change of SQL Server easier

                                    New-Item "HKLM:\Software\Microsoft\MSSQLServer\Client\ConnectTo"

                                    New-ItemProperty -Path "HKLM:\Software\Microsoft\MSSQLServer\Client\ConnectTo" -Name SPSSQL -PropertyType String -Value "DBMSSOCN,PRIV-PAM\PAM"

                                    New-ItemProperty -Path "HKLM:\Software\Microsoft\MSSQLServer\Client\ConnectTo" -Name SVCSQL -PropertyType String -Value "DBMSSOCN,PRIV-PAM\PAM"

    6. Configure registry settings for SID History migration

New-ItemProperty -Path HKLM:SYSTEM\CurrentControlSet\Control\Lsa -Name TcpipClientSupport -PropertyType DWORD -Value 1

                                    Restart-Computer

    7. Test connectivity to SQL Server

                                    Log on to PRIV-PAM as a priv\PAMAdmin

Open PowerShell and run the following code:

                                    [System.Data.SqlClient.SqlConnection]$SqlConnection = New-Object System.Data.SqlClient.SqlConnection

                                    $SqlConnection.ConnectionString = "Server = SPSSQL; Database = Master; Integrated Security = True;"

                                    $SqlConnection.Open()

                                    Write-Host ("Connection state to SPSSQL is {0}" -f $SqlConnection.State)

                                    $SqlConnection.Close()

                                    $SqlConnection.ConnectionString = "Server = SVCSQL; Database = Master; Integrated Security = True;"

                                    $SqlConnection.Open()

                                    Write-Host ("Connection state to SVCSQL is {0}" -f $SqlConnection.State)

                                    $SqlConnection.Close()

At the end of the output you should see the following messages:

                                    Connection state to SPSSQL is Open

                                    Connection state to SVCSQL is Open

                                    Conclusion of Part 2

Now we are ready for Part 3 - Installing the PAM Server.

In this exercise we went step by step through the PAM Service prerequisites setup.

In Part 3 we will set up SharePoint 2016, the PAM Service and Portal, and PAM.

                                    Until then

                                    Have a great week.

                                     

                                    Disclaimer – All scripts and reports are provided ‘AS IS’

                                    This sample script is not supported under any Microsoft standard support program or service. This sample script is provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of this sample script and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of this script be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use this sample script or documentation, even if Microsoft has been advised of the possibility of such damages.
