Template Validation Error - Error: Code=InvalidTemplate; Message=Deployment template validation failed:
I am having a bit of trouble deploying the template below:
One of the error messages that was thrown:
'The template resource '[concat('nsg-create',copyIndex())]' at line '344' and column '9' is invalid. The api-version '2016-07-01' used to deploy the template does not support 'ResourceGroup' property. Please use api-version '2017-05-10' or later to deploy the template. Please see https://aka.ms/arm-template/#resources for usage details.'.
Code that has been used to deploy:
Context "Template Validation" {
It "Template $here\azuredeploy.json and parameter file passes validation" -TestCases $ParameterFileTestCases {
Param( $ParameterFile )
Update-Module -Name AzureRM -Force
$output = New-AzureRmResourceGroupDeployment -ResourceGroupName $TempValidationRG -Force -Mode Complete -TemplateFile "$here\azuredeploy.json" -TemplateParameterFile "$here\$ParameterFile" -ErrorAction Stop 5>&1
$output.ProvisioningState | Should -Be "Succeeded"
}
}
I have tried changing the API version; however, that still gave me the same error. I do not understand how to solve this issue. I am trying to test the template through Pester.
1 answer
-
answered 2018-07-11 06:26
4c74356b41
You need to use another api-version; try this one: 2017-05-10. Also, try updating Azure PowerShell to the latest version.
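For example, a minimal sketch of checking what is installed and then updating the AzureRM module (Install-Module may need an elevated session or -Scope CurrentUser). The error itself also tells you that the nested Microsoft.Resources/deployments resource using the ResourceGroup property needs apiVersion 2017-05-10 or later.
# Check which AzureRM version is installed right now
Get-Module -ListAvailable -Name AzureRM | Select-Object Name, Version
# Pull the latest published release of the module
Install-Module -Name AzureRM -Force -AllowClobber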
See also questions close to this topic
-
Can't kill off an .exe file using Powershell V4.0
Please see below. For whatever reason I can't kill off any of the .exe processes running. Randomly I have seen this script work, which is why I think there is something else going on.
All that I see is that the whole script has worked when testing, but when I promote it to prod it just fires off the email part. The email part is fine.
Will this code guarantee a kill of the processes even if they are hung?
$process = Get-Process -Name "IVR1","IVR2","IVR3"
$IVR1path = "C:\IVR1"
$IVR2path = "C:\IVR2"
$IVR3path = "C:\IVR3"
Get-Process $process -ErrorAction SilentlyContinue
if ($process) {
    Get-Process -Name $process | kill -PassThru
    Start-Sleep -s 5
    cd $IVR1path
    Start-Process ".\IVR1.exe"
    cd $IVR2path
    Start-Process ".\IVR2.exe"
    cd $IVR3path
    Start-Process ".\IVR3.exe"
    cd ..
    cd ..
    $From = "IVR1@example.com.au"
    $To = "myemail@example.com.au"
    $cc = "myemail@example.com.au"
    $Subject = "**TEST** - IVR1 has been recovered"
    $Body = "The IVR has been successfully recovered"
    $SMTPServer = "mail.example.com.au"
    Send-MailMessage -From $From -to $To -cc $cc -Subject $Subject -Body $Body -SmtpServer $SMTPServer
}
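For reference, a minimal sketch of a more forceful variant (process names taken from the question; the timeout value is arbitrary). kill is an alias of Stop-Process; adding -Force suppresses confirmation prompts for processes owned by another user, and Wait-Process confirms the processes have really exited before you restart them.
# Force-stop the IVR processes and wait until they have actually exited
$names = "IVR1", "IVR2", "IVR3"
$procs = Get-Process -Name $names -ErrorAction SilentlyContinue
if ($procs) {
    $procs | Stop-Process -Force -ErrorAction SilentlyContinue
    $procs | Wait-Process -Timeout 30 -ErrorAction SilentlyContinue
}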
-
How to change scanner DPI settings in powershell?
I am making updates to an old image processing app I wrote back when I first got hired on. One request that I have received is to have a "Scan" button on the app so that images can be both scanned and processed without having to open the Epson Scan Manager or push the button (some of the imaging techs have difficulty reaching their scan button from their seats). I have hacked something together in powershell that does the job and can be easily linked to a button in the python app, but I can't select a value for DPI. Resolution matters for these scans, both for customer facing and programmatic reasons, and they have to be at least 300 DPI, but they always save at a much lower resolution and I can't seem to figure out how to get in and alter the WIA settings for the scanner. I can control the compression once the file is saved but I can't control the resolution the scanner uses when it actually scans the picture. I have located this resource but don't know how to actually implement the changing of these settings. We only work with jpegs and these scanners are only used to scan products, with no filters or masks applied, so it should be pretty simple, but I just need to get this DPI thing figured out. This is what I have so far:
Set-ExecutionPolicy RemoteSigned
$deviceManager = new-object -ComObject WIA.DeviceManager
$device = $deviceManager.DeviceInfos.Item(1).Connect()
$imageProcess = new-object -ComObject WIA.ImageProcess
$wiaFormatJPEG = "{B96B3CAE-0728-11D3-9D7B-0000F81EF32E}"
foreach ($item in $device.Items) {
    $image = $item.Transfer()
}
$Basepath = Join-Path -Path "C:\Users" -ChildPath $env:username
$NewPath = Join-Path -Path $BasePath -ChildPath "Pictures\My Scans\scan daemon"
$filename = Join-Path -Path $NewPath -ChildPath "Scan {0}.jpg"
$index = 0
while (test-path ($filename -f $index)) { [void](++$index) }
$filename = $filename -f $index
$image.SaveFile($filename)
I can get the scan and save the file, but it always gets saved in low res. This is a problem both because our customers want to see high res images and because my image processing app is expecting images of a certain size, and so won't even work correctly on these images if we were willing to use them. I feel like this should be pretty simple, possibly even a single line of code, but I'm not super familiar with Windows or powershell and am currently at a loss concerning what that line of code is or how to find it.
Essentially, I just want a way to do this:
SetWIAProperty(scannnerItem.Properties, WIA_HORIZONTAL_SCAN_RESOLUTION_DPI, 300);
SetWIAProperty(scannnerItem.Properties, WIA_VERTICAL_SCAN_RESOLUTION_DPI, 300);
in PowerShell. No matter where I look, I can't seem to find a syntax guide for running .NET commands in PowerShell that doesn't just deal with basic networking.
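For what it's worth, a hedged sketch of setting the resolution on the WIA item before transferring: 6147 and 6148 are the usual WIA property IDs for horizontal and vertical resolution (WIA_IPS_XRES / WIA_IPS_YRES), but verify them against your scanner, and whether Value is writable can vary by driver.
# Set horizontal/vertical resolution to 300 DPI before Transfer()
foreach ($item in $device.Items) {
    foreach ($prop in $item.Properties) {
        if ($prop.PropertyID -eq 6147 -or $prop.PropertyID -eq 6148) {
            $prop.Value = 300   # assumes the driver exposes these properties as writable
        }
    }
    $image = $item.Transfer()
}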
-
Powershell Test-Path Office 365 PRO PLUS
I want to use Test-Path for Microsoft Office 365 PRO PLUS.
I used this code but I want to go for the executable to make sure it is really installed. Please see the code below:
$Office = "C:\Program Files\Microsoft Office 15"
$testoffice = Test-Path $Office
If ($testoffice -eq $true) {Write-Host "Office 365 exist!"}
Else {Write-Host "Office 365 doesn't exist!"}
Read-Host "Press enter to exit"
Am I using the right directory for it? Is there an executable to make sure the installation went through and not just the folder?
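As a hedged sketch of testing for an executable rather than the folder: the exact path depends on the Office version and install type (Click-to-Run vs MSI), so the candidate paths below are examples to adjust against a known-good machine.
# Look for WINWORD.EXE in a couple of likely locations (adjust to your install)
$candidates = @(
    "C:\Program Files\Microsoft Office 15\root\Office15\WINWORD.EXE",
    "C:\Program Files\Microsoft Office\root\Office16\WINWORD.EXE"
)
if ($candidates | Where-Object { Test-Path $_ }) {
    Write-Host "Office 365 exists!"
} else {
    Write-Host "Office 365 doesn't exist!"
}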
-
Flask not served in Anaconda/Docker Container
Hello (I am going crazy here). I am trying to create a Docker container with a Flask web app plus a Miniconda environment, which is then deployed to a container for web apps in Azure.
With docker build -t ana . the container is successfully created, but running it with docker run ana
leads to nothing happening (no error; it just runs, but the Flask server is not started).
The application itself is called, as print statements are actually displayed.
I tried it with pip before, as outlined here: https://www.martinnorin.se/exposing-python-machine-learning-models-using-flask-docker-and-azure/
and it works, but I cannot make it work with Anaconda.
my environment.yml
name: ana
channels:
  - defaults
dependencies:
  - blas=1.0=mkl
  - certifi=2018.11.29=py36_0
  - click=7.0=py36_0
  - flask=1.0.2=py36_1
  - intel-openmp=2019.1=144
  - itsdangerous=1.1.0=py36_0
  - jinja2=2.10=py36_0
  - pip=19.0.1=py36_0
  - pycparser=2.19=py36_0
  - python-dateutil=2.7.5=py36_0
  - pytz=2018.9=py36_0
  - setuptools=40.7.3=py36_0
  - six=1.12.0=py36_0
  - werkzeug=0.14.1=py36_0
  - wheel=0.32.3=py36_0
  - waitress=1.2.0=py36_0
prefix: C:\Users\jji309\AppData\Local\conda\conda\envs\ana
my dockerfile
FROM continuumio/miniconda3

# Set the ENTRYPOINT to use bash
# (this is also where you’d set SHELL,
# if your version of docker supports this)
ENTRYPOINT [ "/bin/bash", "-c" ]

# Conda supports delegating to pip to install dependencies
# that aren’t available in anaconda or need to be compiled
# for other reasons. In our case, we need psycopg compiled
# with SSL support. These commands install prereqs necessary
# to build psycopg.
#RUN apt-get update && apt-get install -y \
#    libpq-dev \
#    build-essential \
#&& rm -rf /var/lib/apt/lists/*

# Use the environment.yml to create the conda environment.
ADD environment.yml /tmp/environment.yml
WORKDIR /tmp
RUN [ "conda", "env", "create" ]

ADD . /code

# Use bash to source our new environment for setting up
# private dependencies—note that /bin/bash is called in
# exec mode directly
ADD setup.py /code/shared/setup.py
WORKDIR /code/shared
RUN [ "/bin/bash", "-c", "source activate ana && python setup.py develop" ]

ADD setup.py /code/setup.py
WORKDIR /code
RUN [ "/bin/bash", "-c", "source activate ana && python setup.py develop" ]

# We set ENTRYPOINT, so while we still use exec mode, we don’t
# explicitly call /bin/bash
ADD app.py /code/app.py
EXPOSE 5000
CMD [ "source activate ana && exec python /code/app.py" ]
my app.py
from flask import Flask
from waitress import serve
from flask import request
import json

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello World!"

@app.route("/do_post", methods=['POST'])
def post_method():
    json = request.get_json()
    name = json["name"]
    age = json["age"]
    return "Hello {}. You're {} years old.".format(name, age)

# This is important so that the server will run when the docker container has been started.
# Host=0.0.0.0 needs to be provided to make the server publicly available.
if __name__ == "__main__":
    serve(app, host='0.0.0.0', port=5000)
-
Where is the log file created when debugging Azure Function in Visual Studio
I have a Timer Azure Function which I execute in VS. Right click on the Azure Function project and Debug. The function has an ILogger log.
Inspecting the log object, I can see that it has two loggers:
- Azure.Functions.Cli.Diagnostics.ColoredConsoleLogger
- Microsoft.Azure.WebJobs.Script.Diagnostics.FileLogger
I also can see that the RootLogPath is %temp%\LogFiles\Application\Functions.
However at that location there is only a "Host" folder. I expected to find a "Function" folder as well with the log file.
Do I need to somehow enable the File Logger? Am I missing anything?
-
Ambari API on HDInsight returns 404
I am trying out some basics on Ambari APIs on an HDInsight cluster. The following requests all return a 404 (Tried both using a browser as well as a REST client)
https://mynewclusterabcd.azurehdinsight.net/ambari/api/v1/clusters
https://mynewclusterabcd.azurehdinsight.net/ambari/api/v1/clusters/mynewclusterabcd/hosts
https://mynewclusterabcd.azurehdinsight.net/ambari/api/v1/clusters/mynewclusterabcd/services
whereas I am able to get to the Ambari cluster dashboard using the following URL, with the same credentials.
https://mynewclusterabcd.azurehdinsight.net
What could be wrong?
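One hedged thing to double-check: per the HDInsight documentation, the Ambari REST endpoint is normally exposed at /api/v1/... without the /ambari prefix (for example https://CLUSTERNAME.azurehdinsight.net/api/v1/clusters/CLUSTERNAME), which could explain the 404. A minimal PowerShell sketch of such a call, with the cluster name and credentials as placeholders:
$cluster = "mynewclusterabcd"
$cred = Get-Credential   # the cluster login (admin) credentials
Invoke-RestMethod -Uri "https://$cluster.azurehdinsight.net/api/v1/clusters" -Credential $cred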
-
How can I directly deploy an ARM template as a solution template in the Azure Marketplace
I have written an ARM template in which I've customized the parameters and installed the application on the server. Now I want to deploy this to the Azure Marketplace as a solution template. What do I need to do?
{ "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#", "contentVersion": "1.0.0.0", "parameters": { "adminUsername": { "type": "string" }, "adminPassword": { "type": "string" }, "envPrefixName": { "type": "string", "metadata": { "description": "Prefix for the environment (2-5 characters)" }, "defaultValue": "cust1", "minLength": 2, "maxLength": 11 } }, "variables": { "scriptURL": " https://raw.githubusercontent.com/rt7055/simpledevbox1/master/simpledevbox.ps1 ", "VMname": "[parameters('envprefixName')]", "storageAccountName": "[concat(uniqueString(resourceGroup().id),'storageaccountkatalyst')]", "virtualNetworkName": "MyVNET", "vnetAddressRange": "10.0.0.0/16", "subnetAddressRange": "10.0.0.0/24", "subnetName": "Subnet", "subnetRef": "[resourceId('Microsoft.Network/virtualNetworks/subnets', variables('virtualNetworkName'), variables('subnetName'))]", "imagePublisher": "MicrosoftWindowsServer", "imageOffer": "WindowsServer", "imageSku": "2012-R2-Datacenter", "publicIPAddressName": "mypublicIP", "nicName": "myVMnic" }, "resources": [ { "type": "Microsoft.Storage/storageAccounts", "name": "[variables('storageAccountName')]", "apiversion": "2015-06-15", "location": "[resourceGroup().location]", "tags": { "displayName": "[variables('storageAccountName')]" }, "properties": { "accountType": "Standard_LRS" } }, { "type": "Microsoft.Network/publicIPAddresses", "name": "[variables('publicIPAddressName')]", "location": "[resourceGroup().location]", "apiVersion": "2018-10-01", "properties": { "publicIPAllocationMethod": "Dynamic" } }, { "apiversion": "2017-06-01", "type": "Microsoft.Network/virtualNetworks", "name": "[variables('virtualNetworkName')]", "location": "[resourceGroup().location]", "tags": { "displayname": "Virtual Networks" }, "properties": { "addressSpace": { "addressPrefixes": [ "[variables('vnetAddressRange')]" ] }, "subnets": [ { "name": "[variables('subnetName')]", "properties": { "addressPrefix": "[variables('subnetAddressRange')]" } } ] } }, { "apiVersion": "2017-06-01", "type": "Microsoft.Network/networkInterfaces", "name": "[variables('nicName')]", "location": "[resourceGroup().location]", "dependsOn": [ "[resourceId('Microsoft.Network/publicIPAddresses/', variables('publicIPAddressName'))]", "[resourceId('Microsoft.Network/virtualNetworks/', variables('virtualNetworkName'))]" ], "tags": { "displayname": " Server Network Interface" }, "properties": { "ipConfigurations": [ { "name": "Ipconfig1", "properties": { "privateIPAllocationMethod": "Dynamic", "publicIPAddress": { "id": "[resourceId('Microsoft.Network/publicIPAddresses', variables('publicIPAddressName'))]" }, "subnet": { "id": "[variables('subnetRef')]" } } } ] } }, { "apiVersion": "2017-03-30", "type": "Microsoft.Compute/virtualMachines", "name": "[parameters('envprefixName')]", "location": "[resourceGroup().location]", "dependsOn": [ "[resourceId('Microsoft.Storage/storageAccounts/',variables('storageAccountName'))]", "[resourceId('Microsoft.Network/networkInterfaces/',variables('nicName'))]" ], "tags": { "displayname": "My Virtual Machine" }, "properties": { "hardwareProfile": { "vmSize": "Standard_A1" }, "osProfile": { "computerName": "[variables('VMname')]", "adminUsername": "[parameters('adminUsername')]", "adminPassword": "[parameters('adminPassword')]" }, "storageProfile": { "imageReference": { "publisher": "[variables('imagePublisher')]", "offer": "[variables('imageOffer')]", "sku": "[variables('imageSku')]", "version": "latest" }, "osDisk": { "createOption": "FromImage" } 
}, "networkProfile": { "networkInterfaces": [ { "id": "[resourceId('Microsoft.Network/networkInterfaces',variables('nicName'))]" } ] }, "diagnosticsProfile": { "bootDiagnostics": { "enabled": true, "storageUri": "[concat('http://',variables('storageAccountName'),'.blob.core.windows.net')]" } } }, "resources": [ { "apiVersion": "2017-03-30", "type": "extensions", "name": "config-app", "location": "[resourceGroup().location]", "dependsOn": [ "[concat('Microsoft.Compute/virtualMachines/',variables('VMname'))]" ], "tags": { "displayName": "config-app" }, "properties": { "publisher": "Microsoft.Compute", "type": "CustomScriptExtension", "typeHandlerVersion": "1.9", "autoUpgradeMinorVersion": true, "settings": { "fileUris": [ "[variables('scriptURL')]" ] }, "protectedSettings": { "commandToExecute": "[concat('powershell -ExecutionPolicy Unrestricted -File ', './simpledevbox.ps1')]" } } } ] } ] }
I want to build a solution template for this; please let me know how I can publish it to the Azure Marketplace.
-
Example to use a hidden virtual machine offer in a solution template
I'm looking for an example of a solution template mainTemplate.json file which includes a reference to a hidden Azure Marketplace Virtual Machine offer.
In the VM Image References & Disks of the contribution guide in the azure-quickstart-templates, we can see an example with a core platform image and with a public Azure Marketplace Virtual Machine offer, but nothing with a hidden Azure Marketplace Virtual Machine offer.
In the Microsoft Build 2018 "Building Solution Templates and Managed Applications for the Azure Marketplace" session (from 08:10), Patrick is saying that we should "Import as URI" but I'm not sure how this can be achieved.
-
Passing credential to DSC from arm template
I am trying to pass a user credential to my DSC script via an ARM template. It seems the template is not passing the credential correctly to the DSC.
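For reference, a hedged sketch of what the DSC side usually looks like when it expects a credential; the configuration, node, and parameter names below are placeholders, and on the ARM side the credential is normally supplied through the DSC extension's protectedSettings so it stays encrypted.
# Hypothetical DSC configuration taking a PSCredential parameter
Configuration ConfigureServer {
    param (
        [Parameter(Mandatory)]
        [System.Management.Automation.PSCredential]
        $AdminCredential
    )

    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node localhost {
        Script ExampleStep {
            GetScript            = { @{ Result = '' } }
            TestScript           = { $false }
            SetScript            = { Write-Verbose "Running the step" }
            PsDscRunAsCredential = $AdminCredential   # resource runs as the passed credential
        }
    }
}
# Note: compiling with a plain-text credential requires PSDscAllowPlainTextPassword
# in the configuration data, or a certificate for encrypting the MOF.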
Thanks
-
Copy files from Azure BLOB storage to SharePoint Document Library
I cannot find a way to copy files\folders from Blob storage to a SharePoint document library. So far, I've tried AZCopy and PowerShell:
*AzCopy cannot connect to SharePoint as the destination
*PowerShell works for local files, but the script cannot connect to Blob storage (Blob storage cannot be mapped as a network drive)
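For what it's worth, a hedged two-step sketch: stage the blob locally with the Azure storage cmdlets, then push it to SharePoint Online with PnP PowerShell. The storage account, container, site URL, file and library names below are all placeholders, and the PnP module has to be installed separately.
# Download the blob to a local path
$ctx = New-AzureStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $key
Get-AzureStorageBlobContent -Container "mycontainer" -Blob "report.pdf" -Destination "C:\temp\report.pdf" -Context $ctx

# Upload the local file to a SharePoint document library
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/MySite" -Credentials (Get-Credential)
Add-PnPFile -Path "C:\temp\report.pdf" -Folder "Documents"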
-
SAS token as a SecureString not working with ARM template deployment using Azure PowerShell
I have a bunch of nested ARM templates meant to be deployed using Azure PS.
The only way to do that is to host those templates in an Azure blob container, generate a SAS token, and send these 2 parameters to the main ARM template (which points to the nested ones). Here is my PS that generates the SAS token:
$SasToken = ConvertTo-SecureString -AsPlainText -Force (New-AzureStorageContainerSASToken -Container $StorageContainerName -Context $StorageAccount.Context -Permission r -ExpiryTime (Get-Date).AddHours(4))
Here are 2 parts of my deployment script which pass the token to the main ARM template:
$Parameters['_artifactsLocationSasToken'] = $SasToken
and
New-AzureRmResourceGroupDeployment -Name ((Get-ChildItem $TemplateFile).BaseName + '-' + ((Get-Date).ToUniversalTime()).ToString('MMdd-HHmm')) `
    -ResourceGroupName $ResourceGroupName `
    -TemplateFile $TemplateFile `
    -TemplateParameterObject $Parameters `
    -Force -Verbose `
    -ErrorVariable ErrorMessages
Here is the declaration for the receiving parameter to the main ARM template:
"_artifactsLocationSasToken": { "type": "securestring" }
Here is the nested resource template (which happens to be a cosmos db) in the same main ARM template:
{ "apiVersion": "2017-05-10", "dependsOn": [ "[concat('Microsoft.Resources/deployments/', variables('vnetConfig').Name)]" ], "name": "[variables('cosmosDbConfig').Name]", "properties": { "mode": "Incremental", "templateLink": { "uri": "[concat(parameters('_artifactsLocation'), '/', variables('nestedTemplatesFolder'), '/cosmosdb.json', parameters('_artifactsLocationSasToken'))]" }, "parameters": { "cosmosDbConfig": { "value": "[variables('cosmosDbConfig')]" } } }, "type": "Microsoft.Resources/deployments" }
When I run these, I get this error:
Error: Code=InvalidTemplate; Message=Deployment template validation failed: 'The provided value for the template parameter '_artifactsLocationSasToken' at line '16' and column '39' is not valid.'
If I hard-code the SAS token in the nested template resource (in the main template) and change the type from securestring to string, it just works! What is it that I am missing?
-
Powershell error Remove-AzureStorageBlob Method not found: 'Void
Getting an error while calling Remove-AzureStorageBlob in PowerShell:
Remove-AzureStorageBlob -Container $ConName -Blob $BlobName -Context $Ctx

Remove-AzureStorageBlob : Method not found: 'Void Microsoft.WindowsAzure.Storage.OperationContext.set_StartTime(System.DateTime)'.
At line:1 char:1
+ Remove-AzureStorageBlob -Container $ConName -Blob $BlobName -Context ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : CloseError: (:) [Remove-AzureStorageBlob], StorageException
    + FullyQualifiedErrorId : StorageException,Microsoft.WindowsAzure.Commands.Storage.Blob.RemoveStorageAzureBlobCommand
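A hedged first diagnostic step: "Method not found" errors from the storage cmdlets usually come from mismatched Azure module/assembly versions being loaded side by side, so comparing what is installed (module names here are the classic AzureRM-era ones used in the question) can narrow it down.
# List the installed versions of the Azure storage-related modules
Get-Module -ListAvailable -Name AzureRM, AzureRM.Storage, Azure.Storage | Select-Object Name, Version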
-
Pester should only check function up to certain line
Can I put any kind of condition in $null = Get-function @Params to make sure that this will check only between (c and g) of Get-function and then return to Pester?
Pester script:
Params = @(
    $b = ''
    $C = ''
)
$null = Get-function @Params
$a = ''
Function starts
Function 'Get-function'
param = @(
    $b = ''
    $C = ''
)
a
b
c
d
e
f
g
h
i
j
k
end
Can I put any kind of condition in the first line to make sure that this will check only a certain part of Get-function and return to Pester?
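One hedged approach, rather than stopping the function part-way: if the work after the section you care about is done by commands the function calls, those commands can be mocked so they do nothing. The sketch below is hypothetical: Invoke-LaterStep stands in for whatever Get-function calls after step g, and mocking only intercepts commands visible to Pester (dot-sourced code, or a module targeted with -ModuleName).
Describe "Get-function, early part only" {
    Mock Invoke-LaterStep { }   # swallow everything after the part under test

    It "runs the c..g section" {
        $null = Get-function @Params
        Assert-MockCalled Invoke-LaterStep -Times 1
    }
}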
-
Process moving from one script to another in powershell
MainFunction, which I want to check by writing another script.
In short, I just want to check a certain part of the main function and go back to Pester.
param(
    [int]$jobCardID = $(Throw '-jobCardID is required'),         # VSTS Job Card ID
    [string]$filePath = $(Throw '-filePath is required'),        # Relative path to the file to test
    [string]$step = $(Throw '-step is required'),                # Step (verification / approval)
    [string]$status = $null,                                     # Status (Approved / Rejected)
    [string]$jenkinsJobID = $(Throw '-jenkinsJobID is required') # Jenkins Job ID
)

$ScriptPath = Split-Path -parent $MyInvocation.MyCommand.Definition
Write-Output ("INFO: Script root path is {0}" -f $ScriptPath)

. "$ScriptPath\Helpers\Jenkins.ps1"
. "$ScriptPath\Helpers\MetaData.ps1"
. "$ScriptPath\Helpers\VSTS.ps1"
. "$ScriptPath\STA_AOI.ps1"

#region Variables Setup
$Passed = $true
$Errors = @()
$ManualCheck = @()
$ManualCheckNotRequired = $true
$SectionDelimiter = '{0}' -f ("-" * 112)
# End Variables Setup
Pester script for middle part of the function.
Context "checking internal metadata parameters" { #test1 start it "ScriptPath returns correct location as well as region scripts" { $Params = @{ jobCardID = 9223 filePath = '..\..\..\DataFiles\Test\Verification test\AOI\Staging\ctrPassTest.L5X' step = 'verification' status = '' jenkinsJobID = '321' } . ./Mainfunction.ps1 @Params #Can I put any condition here that will only allow to go up to 15 lines in actual function. $ScriptPath | Should -be $PSScriptRoot "$ScriptPath\Helpers\Jenkins.ps1" | should -Exist "$ScriptPath\Helpers\MetaData.ps1" | should -Exist "$ScriptPath\Helpers\VSTS.ps1" | should -Exist "$ScriptPath\STA_AOI.ps1" | should -Exist }
So the question is: how can I send my process from Mainfunction.ps1 line 14 back to the test function to confirm some information? I don't want to check anything after line 14 in Mainfunction.
-
Powershell tests in TravisCI fail due to missing RequiredModules
I'm playing with powershell and run my tests locally and on TravisCI.
Module Manifest
RequiredModules = @('ClipboardText')
Travis Config
Currently I install powershell and pester before running my tests:
addons:
  apt:
    sources:
      - sourceline: deb [arch=amd64] https://packages.microsoft.com/ubuntu/14.04/prod trusty main
        key_url: https://packages.microsoft.com/keys/microsoft.asc
    packages:
      - powershell
      - xclip
before_script:
  - pwsh -Command 'Install-Module -Name Pester -Force -Scope CurrentUser'
script:
  - make test
Makefile
test:
	pwsh -Command 'Get-childItem -Recurse *.test.ps1 | foreach { Invoke-Pester -EnableExit $$_ }'
Travis Build
Build throws error:
Import-Module : The required module 'ClipboardText' is not loaded. Load the module or remove the module from 'RequiredModules' in the file '/home/travis/build/edouard-lopez/lesspass-powershell/lesspass.psd1'.
At /home/travis/build/edouard-lopez/lesspass-powershell/Clipboard.test.ps1:1 char:1
+ Import-Module $PSScriptRoot/lesspass.psd1 -Force # force code to be ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ResourceUnavailable: (/home/travis/bu...l/lesspass.psd1:String) [Import-Module], MissingMemberException
    + FullyQualifiedErrorId : Modules_InvalidManifest,Microsoft.PowerShell.Commands.ImportModuleCommand
Question
I thought that declaring RequiredModules would install ClipboardText, thus allowing my test to be executed correctly. If I manually install the module ClipboardText locally, my test works, but is it the right thing to do on CI and for future distribution of my module?
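A hedged note to close the loop: RequiredModules declares a dependency but does not install it, so on CI the usual approach is to install it explicitly next to Pester (module names as used in the question), for example in before_script:
Install-Module -Name Pester -Force -Scope CurrentUser
Install-Module -Name ClipboardText -Force -Scope CurrentUser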