Forever Breathes The Lonely Word - My Tech Blog

There are currently no feeds available for new posts, but then again I don't have that many new posts anyway.

My blog contains only original content created by me. I don't copy other people's stuff; I write about things that I come across in my daily work in IT.

Robocopy Directory Failure

Published: 13 November 2018

I was using Robocopy.exe on a live server to back up files to a remote share every day. In Task Scheduler the robocopy task returned with a code of 8, meaning:

"Some files or directories could not be copied (copy errors occurred and the retry limit was exceeded). Check these errors further."

and indeed the log file had something like this:

            Total    Copied   Skipped  Mismatch    FAILED    Extras
 Dirs :     13796         1     13794         0         1         0
Files :   4011310     14482   3996828         0         0         3
Bytes : 283.661 g   1.378 g 282.283 g         0         0    33.9 k
Times :   0:15:15   0:08:03                       0:00:00   0:07:12

There is a failed directory copy, but even with full logging enabled, robocopy wouldn't tell me which directory it was.

At this point one option would be to start up Process Monitor and check for failing file access, but this being a busy live server, that wasn't really an option because it slows down the system a bit.

So I wrote a quick PowerShell one-liner:

ls K:\data\ -Directory | % { robocopy $_.FullName $home *.x6t /NDL /S /NP /R:1 /W:1 /Log+:$Home\test.log}

Note the *.x6t, a file extension that does not exist at all on my server, so nothing is ever copied.

Now I could review the log file and search for something like:

      0         1         0

Just above the matching line I could see the parent directory, in my case 'export-2017', so I continued with that as the new root:

ls K:\data\export-2017 -Directory | % { robocopy $_.FullName $home *.x6t /NDL /S /NP /R:1 /W:1 /Log+:$Home\test2.log}

I reviewed the log file again and continued the same way until I finally found an error message:

2018/11/13 20:29:22 ERROR 5 (0x00000005) Accessing Source Directory K:\data\export-2017\05\20\ Access is denied. 

I fixed the permissions on that directory and from then on the Robocopy process worked just fine.
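The manual narrowing down can also be scripted. Here is a minimal sketch (assuming PowerShell 3.0 or later for the -Directory switch) that walks the tree and reports directories that deny access, without involving robocopy at all:

```powershell
# Walk the tree, collecting (not displaying) any errors,
# then print the directories that raised "access denied".
Get-ChildItem K:\data -Directory -Recurse -ErrorAction SilentlyContinue -ErrorVariable errs |
    Out-Null
$errs |
    Where-Object { $_.Exception -is [System.UnauthorizedAccessException] } |
    ForEach-Object { $_.TargetObject }
```

Note that this tests access for the current account; a robocopy task running as a different account (for example SYSTEM under Task Scheduler) may see different permissions.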

At the Visual Studio 2017 launch event here in Singapore I picked up a copy of the VS 2017 Enterprise offline installer files (16.2 GB).

I was told something about certificates I would have to install, but I ignored that and tried to run the installer without an internet connection.

Lo and behold, it failed because it couldn't connect to the internet.

After installing the three *.p12 certificates in the layoutroot\certificates folder, it worked fine without trying to go online.

Well, I don't like installing random certificates on my machine, so I looked into what they are:

Open the certificates MMC snap-in and navigate to:

Console Root - Current User - Intermediate Certification Authorities - Certificates:

You should be able to see:

File    : layoutRoot\certificates\manifestSignCertificates.p12
msc:    : Microsoft Code Signing PCA 2011
Path    : Cert:\currentuser\CA\F252E794FE438E35ACE6E53762C0A234A2C52135
Subject : CN=Microsoft Code Signing PCA 2011, O=Microsoft Corporation, L=Redmond, S=Washington, C=US

File    : layoutRoot\certificates\manifestCounterSignCertificates.p12
msc:    : Microsoft Time-Stamp PCA 2010
Path    : Cert:\currentuser\CA\2AA752FE64C49ABE82913C463529CF10FF2F04EE
Subject : CN=Microsoft Time-Stamp PCA 2010, O=Microsoft Corporation, L=Redmond, S=Washington, C=US

File    : layoutRoot\certificates\vs_installer_opc.SignCertificates.p12
msc:    : Microsoft Code Signing PCA
Path    : Cert:\currentuser\CA\3CAF9BA2DB5570CAF76942FF99101B993888E257
Subject : CN=Microsoft Code Signing PCA, O=Microsoft Corporation, L=Redmond, S=Washington, C=US
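Instead of the MMC snap-in, you can also inspect a .p12 directly in PowerShell before importing it (a sketch; the layout path is an example, and these files are not password protected, so no prompt should appear):

```powershell
# Show what is inside the .p12 without importing it.
Get-PfxCertificate -FilePath "X:\vs2017offline\certificates\manifestSignCertificates.p12" |
    Format-List Subject, Issuer, Thumbprint, NotAfter
```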

All three are code-signing related and are used by the installer to verify the digital signatures of the installation packages. Without these certs, the installer would have to go online to check them.

Because they don't have a key usage of Server Authentication, they cannot be used to act as a man-in-the-middle for your TLS web traffic.

Furthermore, because they are only in the Cert store for the user who installed Visual Studio, they don't affect other users.

Visual Studio will run without them, so you could delete them, but you may need them if you want to add/remove VS features later while still offline.
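If you do decide to remove them later, the thumbprints shown above identify them (a sketch; run as the same user who imported them):

```powershell
# Remove the three VS offline-installer certificates from the
# current user's intermediate CA store, by thumbprint.
'F252E794FE438E35ACE6E53762C0A234A2C52135',
'2AA752FE64C49ABE82913C463529CF10FF2F04EE',
'3CAF9BA2DB5570CAF76942FF99101B993888E257' |
    ForEach-Object { Remove-Item "Cert:\CurrentUser\CA\$_" -ErrorAction SilentlyContinue }
```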

Here's an example of the PowerShell I use to install VS 2017:

$layoutDir = "X:\Software\vs2017offline"
$wlBase = "Microsoft.VisualStudio.Workload"

Import-PfxCertificate -FilePath "$layoutDir\certificates\manifestCounterSignCertificates.p12" -CertStoreLocation Cert:\currentuser\CA -Verbose
Import-PfxCertificate -FilePath "$layoutDir\certificates\manifestSignCertificates.p12" -CertStoreLocation Cert:\currentuser\CA -Verbose
Import-PfxCertificate -FilePath "$layoutDir\certificates\vs_installer_opc.SignCertificates.p12" -CertStoreLocation Cert:\currentuser\CA -Verbose

&  $layoutDir\vs_enterprise__1481358475.1484949683.exe  --norestart --add $wlBase.NetCoreTools --add $wlBase.NetWeb

Today I played a bit with containers on Windows Server 2016. There is a quick start page about how to install Docker with step-by-step instructions. One step involves adding the location of the Docker binaries to your Path variable.

The suggested PowerShell code was:

[Environment]::SetEnvironmentVariable("Path", $env:Path + ";C:\Program Files\Docker", [EnvironmentVariableTarget]::Machine)

You see similar code all over the internet, and even my own code used the same technique for years, which doesn't make it any better.

It kind of works, but it has several problems.

I'll point these out using a simplified path variable value.

Consider this my original registry value under HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment:

There is a REG_EXPAND_SZ value type with a name of Path and this value:


At this point the value of $env:path is:


Now run the suggested code:

[Environment]::SetEnvironmentVariable("Path", $env:Path + ";C:\Program Files\Docker", [EnvironmentVariableTarget]::Machine)

Now $env:path looks like what we expect:

C:\Windows\system32;D:\tools;C:\Users\name\AppData\Local\Microsoft\WindowsApps;C:\Program Files\Docker

But let's look at the registry again:

There is now a REG_SZ value:

C:\Windows\system32;D:\tools;C:\Users\name\AppData\Local\Microsoft\WindowsApps;C:\Program Files\Docker

There are three problems here:

  • The location C:\Users\name\AppData\Local\Microsoft\WindowsApps has been copied from a user-specific %path% entry to a machine-specific one.

  • %SystemRoot% and %ToolsFolder% are now stored as their expanded hard-coded values, not variables.

  • The type of the registry value changed from REG_EXPAND_SZ to REG_SZ

The causes are: the command uses the expanded string in $env:Path, not the un-expanded one with its %variables%; $env:Path also contains user locations, because it merges the machine and user path values; and SetEnvironmentVariable always creates a REG_SZ value, regardless of whether a REG_EXPAND_SZ value already exists.
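You can verify the first point yourself by reading the raw value back with the .NET registry API, which can return the un-expanded string (a minimal sketch):

```powershell
# Open the machine environment key and read Path without expanding variables.
$key = [Microsoft.Win32.Registry]::LocalMachine.OpenSubKey(
    'SYSTEM\CurrentControlSet\Control\Session Manager\Environment')
$key.GetValue('Path', '', 'DoNotExpandEnvironmentNames')  # raw string, %vars% intact
$key.GetValueKind('Path')  # ExpandString = REG_EXPAND_SZ, String = REG_SZ
$key.Close()
```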

So what? Is this really a problem? Suppose I decide to change the environment variable %ToolsFolder% from D:\tools to C:\Util. In the correct setup everything would keep working and C:\Util would be in the path, but with this broken path value none of my tools binaries are found anymore. Any other operation that expects the path value to be REG_EXPAND_SZ may break as well.

The catch is that adding a new location to the path variable the correct way takes a few more lines, not just one.
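The core of the correct approach can be sketched roughly like this (a minimal sketch, assuming an elevated session; the Docker path is the example from above, and error handling is omitted):

```powershell
# Read the un-expanded machine Path, append the new location if missing,
# and write it back explicitly as REG_EXPAND_SZ.
$key = [Microsoft.Win32.Registry]::LocalMachine.OpenSubKey(
    'SYSTEM\CurrentControlSet\Control\Session Manager\Environment', $true)
$raw = $key.GetValue('Path', '', 'DoNotExpandEnvironmentNames')
if (($raw -split ';') -notcontains '%ProgramFiles%\Docker') {
    $key.SetValue('Path', "$raw;%ProgramFiles%\Docker",
        [Microsoft.Win32.RegistryValueKind]::ExpandString)
}
$key.Close()
```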

I wrote a small PowerShell script that does this without any of the problems mentioned above; using it, you can do:

Set-PathVariable.ps1 -NewLocation "%ProgramFiles%\Docker"

The script is available on my GitHub page.

Tags: Windows

Orphaned IIS APPPOOL accounts

Published: 28 September 2016

In my answer to the question List of Hidden / Virtual Windows User Accounts I wrote:

Even these lists don't give you every possible account. For example, you can create an application pool FooBarPool then delete it again, you can still use IIS APPPOOL\FooBarPool in the permissions dialog, so there must be an internal list somewhere.

I'm talking about accounts that you can use to set NTFS and other object permissions.

In this post I'm going to answer the question of where these orphaned accounts are stored.

If you create a new IIS application pool FooBarPool, nothing really happens at first. But as soon as you run the AppPool for the first time, by hitting a site that uses the pool, a new virtual account IIS APPPOOL\FooBarPool is created with the SID S-1-5-82-3350508232-2665999247-216229732-1971348742-544991869.

You can see that SID in the Process Explorer properties for the w3wp.exe process.

This SID is always the same for all AppPools with the name FooBarPool on any computer.

All the IIS APPPOOL\* accounts have the prefix S-1-5-82-, and the rest is derived from a SHA-1 hash of the string foobarpool.
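This derivation can be reproduced in a few lines (a sketch, assuming the documented scheme: SHA-1 of the lower-cased pool name in UTF-16LE, split into five little-endian 32-bit sub-authorities):

```powershell
function Get-AppPoolSid([string]$Name) {
    # Hash the lower-cased pool name as UTF-16LE bytes.
    $bytes = [System.Text.Encoding]::Unicode.GetBytes($Name.ToLowerInvariant())
    $hash  = [System.Security.Cryptography.SHA1]::Create().ComputeHash($bytes)
    # The 20 hash bytes become five 32-bit sub-authorities.
    $subs  = for ($i = 0; $i -lt 20; $i += 4) { [BitConverter]::ToUInt32($hash, $i) }
    'S-1-5-82-' + ($subs -join '-')
}
Get-AppPoolSid 'FooBarPool'
```

For 'FooBarPool' this should print the SID quoted above, on any machine.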

This account is saved in the registry under:


which lists all accounts used on the local machine and is normally not accessible even to administrators, but you can use:

psexec -s -i regedit.exe

to look at that key.

Now deleting the AppPool FooBarPool doesn't delete the account created for it. Creating a new AppPool with the same name will use the existing account.

So when using:

icacls.exe C:\test /grant "IIS APPPOOL\FooBarPool:(OI)(CI)(M)"

The OS hashes the name to get the SID for that account and finds it in the registry, even though the AppPool no longer exists.

So can we find all the application pool names that these accounts were created for? I don't think so: only the SID is stored, and while it is easy to get the SID from the name of the pool, it should be impossible to get the name from the SID, because it is a one-way hash.

Tags: IIS | Security | Windows

Bloated Path Variable in Windows

Published: 25 September 2016

After running a Windows OS for a while and installing some software, the path variable is often filled up with entries you don't really need. Run the following line in PowerShell:

($env:path -split ";") | % { if (Test-Path ([System.Environment]::ExpandEnvironmentVariables($_))) { Write-Host $_ } else { Write-Host $_ -f Red } }

It shows all the entries in %path%; on my current system it looks like this:

C:\WINDOWS\system32
C:\WINDOWS
C:\WINDOWS\System32\Wbem
C:\WINDOWS\System32\WindowsPowerShell\v1.0\
Q:\bin
Q:\sbin
C:\Program Files\Microsoft SQL Server\110\Tools\Binn\
C:\Program Files\Microsoft SQL Server\110\DTS\Binn\
C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\
C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\
C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\
C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\PrivateAssemblies\
C:\WINDOWS\System32\Windows System Resource Manager\bin
C:\WINDOWS\idmu\common
C:\Program Files (x86)\Microsoft SDKs\TypeScript\0.9\
C:\Program Files (x86)\Windows Live\Shared
C:\Program Files\TortoiseGit\bin
C:\Program Files (x86)\Skype\Phone\
C:\Program Files (x86)\ATI Technologies\ATI.ACE\Core-Static
C:\Program Files\Microsoft\Web Platform Installer\
C:\Program Files (x86)\Microsoft SQL Server\Client SDK\ODBC\130\Tools\Binn\
C:\Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\
C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Binn\
C:\Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\ManagementStudio\
%USERPROFILE%\.dnx\bin
C:\Program Files\Microsoft DNX\Dnvm\
C:\Program Files\Microsoft SQL Server\120\Tools\Binn\
C:\Program Files\Microsoft SQL Server\130\Tools\Binn\
C:\Program Files (x86)\Windows Kits\10\Windows Performance Toolkit\
C:\Program Files (x86)\Microsoft Emulator Manager\1.0\
C:\Users\username\AppData\Local\Microsoft\WindowsApps

WTF, and I'm sure there are worse cases around. Especially SQL Server seems to feel it is very important.

It's not really a big problem to have too many entries in the %path% variable, but I like to keep things lean.

Plus, every time I execute a command without a fully qualified path, Windows will look in all these locations to find it. If that program doesn't exist on my computer, it takes longer to tell me when there are many locations to be searched. Okay, this may only be a few milliseconds, but still.

The red entry above, %USERPROFILE%\.dnx\bin, means that path doesn't actually exist on my computer.

To clean things up, I decided to just remove most entries and see what happens. First I made a backup. We have to be careful here, because the way the path variable is presented to the user is not the way it is stored in the registry. This is because the value:

Path under HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager\Environment

is of type REG_EXPAND_SZ, meaning a string that has any variables in it expanded first and then returned. So %SystemRoot% becomes C:\Windows. You could use regedit.exe to export the whole Environment key, but it stores the path value in an ugly hex-encoded value.

I use this PowerShell code:

($(New-Object -com "WScript.Shell").Environment("System"))["Path"] | Set-Content -Path .\PathVar.txt

Now I removed everything after the first six entries and rebooted my machine.

C:\WINDOWS\system32
C:\WINDOWS
C:\WINDOWS\System32\Wbem
C:\WINDOWS\System32\WindowsPowerShell\v1.0\
Q:\bin
Q:\sbin
C:\Users\username\AppData\Local\Microsoft\WindowsApps

The one thing I knew would break was using sqlcmd.exe without the full path. While looking into this I learned that SQL Server 2016 doesn't even come with sqlcmd.exe; not even the SQL Management tools, which you have to download separately, include it. You have to download sqlcmd separately as well.

Anyway, one way to deal with programs that expect to be in the path is to use intermediate batch files, like git.cmd:

X:\PortableGit\cmd\git.exe %1 %2 %3 %4 %5 %6 %7 %8 

or sqlcmd.cmd:

"%ProgramFiles%\Microsoft SQL Server\Client SDK\ODBC\130\Tools\Binn\SQLCMD.EXE" %1 %2 %3 %4 %5 %6 %7 %8 

These are in my Q:\bin directory, so I can use git or sqlcmd anywhere and they just redirect the call to the correct location.

Now, over time, I will see what else breaks and how to fix it, which may involve putting certain entries back into the path variable.

I just don't see the point of having an entry in there for a program that I may use once a year and that I could start just as well using its full path.

Tags: Windows

Let's Encrypt on IIS

Published: 21 September 2016

I had heard about Let's Encrypt quite some time ago but never got around to looking at it.

It seems to be a mostly unixy project, and early on there were no implementations for Windows Server. In February 2016 Rick Strahl wrote a good introduction for IIS users and talked about the different options. Being a PowerShell guy, I opted for AcmeSharp, which is what the other two options use under the hood anyway.

I followed the Quick Start for one certificate, but then began to write my own script to automate the various steps. When I was halfway done, I discovered that this work had already been done by Bill Seddon, who wrote the script Update-Certificate-http.ps1.

Now updating a certificate can be done with a single line:

Update-Certificate-Http -alias "myalias" -domain "" -websiteName "Default Web Site" -checkParameters

From the trenches

While using AcmeSharp I ran into a few small problems; here's how I fixed them.

Extensionless static files

When using the http-01 challenge, a file without an extension is written to \.well-known\acme-challenge. By default, IIS doesn't serve extensionless files. To fix this, add a new web.config in that directory with the following content:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <system.webServer>
        <staticContent>
            <mimeMap fileExtension="." mimeType="text/plain" />
        </staticContent>
    </system.webServer>
</configuration>

I tried to put this into my main web.config file, but having the path attribute of the location node start with a dot killed the whole site.

https only sites

I have a site that has only https bindings, no port 80. The http-01 challenge always uses http://.../.well-known/acme-challenge, so it could not access the file. I tried to temporarily add an http binding, but I also have strict transport security set for the site, so allowing http would require lots of changes to the site.

The way I worked around this is by using my honeypot 'catch all' site which handles all requests not handled by any other site on the server. Here I allow http. I added a virtual directory pointing to the correct location under the site in question.

Add-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST' -filter "system.applicationHost/sites/site[@name='Honeypot']/application[@path='/']" -name "." -value @{path='/.well-known';physicalPath='%WebSitesDrive%\SiteName\FileRoot\.well-known'}

Using New-WebVirtualDirectory doesn't work because of the dot in the name.

After the challenge has succeeded, we can remove it again:

Remove-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST'  -filter "system.applicationHost/sites/site[@name='Honeypot']/application[@path='/']" -name "." -AtElement @{path='/.well-known'}

If you don't have a Honeypot site, just create a site with an http binding for the site in question and the correct virtual directory. Keep the site stopped except when updating your certificates.

Updating all certificates which are about to expire

I wrote a small script that loops through all SSL bindings in IIS and finds those Let's Encrypt certificates that expire in the next x days. It updates those using the Update-Certificate-Http.ps1 script.

This script, Update-LECertificate.ps1, is on GitHub.
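The core idea of the loop can be sketched like this (assuming the WebAdministration module; 14 days is an example threshold):

```powershell
Import-Module WebAdministration
# Look at every IIS SSL binding, resolve its certificate, and report
# Let's Encrypt certificates that expire soon. (Older, non-SNI bindings
# may have an empty Host and show only the port.)
Get-ChildItem IIS:\SslBindings | ForEach-Object {
    $cert = Get-Item "Cert:\LocalMachine\My\$($_.Thumbprint)"
    if ($cert.Issuer -like "*Let's Encrypt*" -and $cert.NotAfter -lt (Get-Date).AddDays(14)) {
        '{0}:{1} expires {2}' -f $_.Host, $_.Port, $cert.NotAfter
    }
}
```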

Scheduling the script

All that is left to do is scheduling the script to run once a week or so.

Luckily the script knows how to do this, just use:

.\Update-LECertificate.ps1 -schedule

This runs the script every Sunday at 3:30 am. For more options on how to schedule the script use:

help .\Update-LECertificate.ps1 -full

Tags: IIS | Security | Web

Nowadays everybody and their brother try to install their own root certificate into your Windows machine.

Sometimes they have a good reason, like Fiddler, but sometimes they don't.

Sir Mark Russinovich added a feature to his old utility sigcheck.exe:

Using the -tv switch, the program checks all certificates in the user or machine store and finds the ones that are not rooted in the Microsoft trusted root certificate list.

So Microsoft keeps a master list of all the CA root certificates it trusts, and any certificate on your machine should be signed, directly or via an intermediate, by one of these certificates.

So the program shows a nice list of the certificates that are not, and you should review them and either delete them or make a mental note that they are okay.

Sometime in the future you run sigcheck again and you have to do the same.

Wouldn't it be easier to have it run periodically and alert you if any new certs show up in the list? It would also be nice to have a list of exceptions: certificates that are not signed by a trusted CA but are still trusted by you, maybe because you created them yourself.

So I needed something that runs every day, can give me alerts, and allows for some configuration.

Luckily, I already had ServerMonitor.ps1 a PowerShell script that I created in 2007 and that has evolved over the years.

It checks various aspects of your Windows OS and alerts you via various methods. It has a plugin architecture: both providers that collect data and loggers that send the results can be added by just dropping a file into the correct directory.

So, all I had to do was write a new ServerMonitor3 provider that calls sigcheck to do the actual work.

Get the required files

Get sigcheck from TechNet, unblock and unzip it and put the files in a directory of your choice.

Go to my downloads page and get ServerMonitor3. Unblock the file and unzip it. All files should stay together, but you can drop them anywhere you want.

Configure ServerMonitor

In the same directory create a new text file named ServerMonitor3.xml; this is the default configuration file for ServerMonitor, but you could have many different ones if you want.

Edit the file, start with this:

<?xml version="1.0"?>
<servermonitor version="3.0">
    <console enabled="true" />
    <certificates helper="C:\tools\sigcheck64.exe" store="machine" />
</servermonitor>


This defines a single logger, which just writes to the PowerShell console, and a single provider, the certificates checker.

Now we have to tell it where sigcheck.exe is located; edit the helper attribute of the certificates node.

Now we can run the script:

  C:\tools\servermonitor3\ServerMonitor3.ps1 -verbose

You have to adjust the path, of course. The -verbose switch is just there so we see a little bit more of what's going on; you don't need it.

If everything is fine, you will see a green success message; otherwise you will see some information about certificates in yellow.

Now review these using certmgr.msc.

If you find you want to trust certain certificates, add them to your configuration file:

<certificates helper="C:\tools\sigcheck64.exe" store="machine" >
  <allow thumbprint="7E93B6DB9CB2E2D5A412628AE3C55D66DB1DF02620" remark="myCA" />
  <allow thumbprint="C6C256DB9CB2EADFA41262E9FCE6DB9CB243DCB381" remark="Corp Root CA" />
</certificates>

The next time you run the script, it won't complain about these any longer.

Just having the results shown on the command line is not very helpful; you will want to configure additional loggers. The email one is the most handy, and the file one is also nice for logging purposes.

See the ServerMonitor page on how to configure these loggers.

Set up a scheduled task

Finally, you will want to run ServerMonitor every so often; set up a Windows scheduled task to run the script.
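On Windows 8 / Server 2012 and later this can be done with the ScheduledTasks cmdlets (a sketch; the script path and time are examples):

```powershell
# Register a daily task that runs ServerMonitor with its default config file.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
           -Argument '-NoProfile -File C:\tools\servermonitor3\ServerMonitor3.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName 'ServerMonitor3' -Action $action -Trigger $trigger
```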

In the past when I was thinking about elevation of a user I always thought about elevating a member of the administrators group from a Medium Integrity Level to a High one.

But elevation is not only for administrators, it also works for any other user that gets a split security token at login time.

For example, members of the Power Users or Backup Operators groups have a split personality as well.

When logging on normally as such a user and running something as admin, the UAC prompt comes up:

UAC Prompt

The wording here is actually incorrect: I don't have to type an administrator password. Joe Block is not an administrator, but his password gets me past the UAC prompt.

whoami /groups as a normal user:

 Group Name                             Type             SID          Attributes
 ====================================== ================ ============ ==================================================
 Everyone                               Well-known group S-1-1-0      Mandatory group, Enabled by default, Enabled group
 BUILTIN\Users                          Alias            S-1-5-32-545 Mandatory group, Enabled by default, Enabled group
 BUILTIN\Power Users                    Alias            S-1-5-32-547 Group used for deny only
 Mandatory Label\Medium Mandatory Level Label            S-1-16-8192

We can see the Power Users group is not in effect (Group used for deny only); any action that requires this membership will fail.

whoami /groups as an elevated user:

 Group Name                                  Type             SID          Attributes
 =========================================== ================ ============ ==================================================
 Everyone                                    Well-known group S-1-1-0      Mandatory group, Enabled by default, Enabled group
 BUILTIN\Users                               Alias            S-1-5-32-545 Mandatory group, Enabled by default, Enabled group
 BUILTIN\Power Users                         Alias            S-1-5-32-547 Mandatory group, Enabled by default, Enabled group
 Mandatory Label\Medium Plus Mandatory Level Label            S-1-16-8448

Now, elevated, the second part of the split token is in effect and we are a proper Power User.

whoami /groups as elevated member of Backup Operators:

 Group Name                           Type             SID          Attributes
 ==================================== ================ ============ ==================================================
 Everyone                             Well-known group S-1-1-0      Mandatory group, Enabled by default, Enabled group
 BUILTIN\Remote Desktop Users         Alias            S-1-5-32-555 Mandatory group, Enabled by default, Enabled group
 BUILTIN\Backup Operators             Alias            S-1-5-32-551 Mandatory group, Enabled by default, Enabled group
 Mandatory Label\High Mandatory Level Label            S-1-16-12288

The difference between the last two is that a backup operator gets an integrity level of High, while the power user only gets Medium Plus (whatever that means).

Now to my question: sometimes I need to run an elevated process for such a user while a different (standard) user is logged on to Windows.

It is easy to start a non-elevated process using the (Shift+)context menu Run as different user or runas.exe. But trying to start an elevated one in PowerShell with:

start-process -verb runas powershell.exe

only shows me real administrators in the UAC prompt.

I tried other elevation tools, but they all bring up the same UAC prompt.

Even the following doesn't work:

$someCredentials = Get-Credential
Start-Process powershell -Credential $someCredentials -ArgumentList '-noprofile -command &{Start-Process powershell.exe -verb runas}'

I still get a UAC prompt that doesn't offer the non-admin account I want to use.

My UAC level is: Default - Always notify me when: (slider at the top) and I don't want to change that.

The only solution I found so far only works if I already have an elevated administrator PowerShell running; then I can use:

psexec.exe -u USERNAME -p PASSWORD -d -h -i -accepteula $env:SystemRoot\System32\WindowsPowerShell\v1.0\powershell.exe

I'm using psexec with the -h switch, meaning: run the process with the account's elevated token, if available. I also have to specify the username and the password for the account.

I think elevating a user should be possible without the help of an administrator but I don't know how. psexec.exe gives me Access Denied if I run it as a non-admin.

IIS - Nested comments in config files

Published: 27 August 2015

One nice feature of XML based configuration is that you can add comments anywhere to explain why a certain configuration value has been set this way.

For IIS I use this most often to comment on the IP addresses I allow for certain sites, like:

        <ipSecurity allowUnlisted="false">
            <!-- Susan's laptop -->
            <add ipAddress="" allowed="true"/>
            <!-- public IP at work -->
            <add ipAddress="" allowed="true" />
            <!-- local home network -->
            <add ipAddress="" subnetMask="" allowed="true" />
            <!-- explicit deny Mark's network -->
            <add ipAddress="" subnetMask="" allowed="false" />
        </ipSecurity>

Without these comments I would come back to the configuration some time later and not know what these addresses are or whether I still need them.

The other day I had to allow access to a site from everywhere, I could not just change the 'allowUnlisted' value because I have both 'allow' and 'deny' entries in the list.

Normally I would just comment out the whole 'ipSecurity' node, but this isn't possible because XML does not allow nested comments.

My first fix was to move the specific comments out of the node into their own comment section. That works, but it's a pain if you have many comments, and you are losing the direct association with the add node.

<!-- ipSecurity info:
      = Susan's laptop
      = public IP at work
-->

A cleaner solution is to extend the IIS schema to allow a comment directly on the 'add' node.

To do that I created a new file, 'my_schema.xml', with the following content:

    <configSchema>
        <sectionSchema name="system.webServer/security/ipSecurity">
            <collection addElement="add">
                <attribute name="remark" type="string" defaultValue="" />
            </collection>
        </sectionSchema>
    </configSchema>

This adds a new 'remark' attribute to the 'add' node, which allows me to put my comment directly on the node like this:
        <ipSecurity allowUnlisted="false">
            <add ipAddress="" allowed="true" remark="Susan's laptop" />
            <add ipAddress="" allowed="true" remark="public IP at work" />
            <add ipAddress="" subnetMask="" allowed="true" remark="local home network" />
            <add ipAddress="" subnetMask="" allowed="false" remark="explicit deny Mark's network" />
        </ipSecurity>

This doesn't show up in the IIS Manager UI, but it does in the Configuration Editor:

config editor

This means I can edit my comments in the GUI and don't have to edit the config file directly anymore.
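The new attribute is also accessible from PowerShell like any built-in one (a sketch; the site name is an example):

```powershell
# List the ipSecurity entries of a site together with the custom remark.
Import-Module WebAdministration
(Get-WebConfiguration 'system.webServer/security/ipSecurity' `
    -PSPath 'IIS:\Sites\Default Web Site').Collection |
    Select-Object ipAddress, subnetMask, allowed, remark
```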

If you use that web.config on a different server, you have to remember to copy the 'my_schema.xml' file as well; otherwise you will get a '500.19' configuration error complaining:

Unrecognized attribute 'remark'
