# Install Azure Tools with the Windows Package Manager (WinGet)

Sometimes you need to set up a fresh developer or admin workstation with all the latest Azure tools available. I created a quick list of commands showing how you can install Azure tools using the Windows Package Manager (WinGet) on your Windows 10 or Windows 11 machine.

First, you will need to install the Windows Package Manager (WinGet). If you don't have winget on your machine already (it will ship in later versions of Windows by default), you can find my blog here on how to install it. If you want to learn more about WinGet and how to get started, check out my blog here.

Here is a list of Azure tools I installed using WinGet when setting up a new developer or administrator workstation. This might be different for your needs, but it will give you an overview of how to do it. It also includes things like the Azure CLI and Azure PowerShell. As a reminder, you can also use Azure Cloud Shell, which already has a lot of these tools preinstalled and can be run within Visual Studio Code or the Windows Terminal.

```powershell
# Install PowerShell 7 and Azure PowerShell
winget install Microsoft.PowerShell
# Then run Install-Module -Name Az inside PowerShell 7 to add the Azure PowerShell module

# Install the Azure CLI
winget install Microsoft.AzureCLI

winget install Microsoft.AzureStorageExplorer
# winget install Microsoft.AzureStorageEmulator
winget install Microsoft.AzureFunctionsCoreTools
# winget install Microsoft.AzureCosmosEmulator
# winget install Microsoft.azure-iot-explorer
# winget install Microsoft.ServiceFabricRuntime

# Download AzCopy (the download URI is not included here) and add it to the user PATH
Invoke-WebRequest -Uri "" -OutFile AzCopy.zip -UseBasicParsing
Expand-Archive ./AzCopy.zip ./AzCopy -Force
mkdir $home\AzCopy
Get-ChildItem ./AzCopy/*/azcopy.exe | Move-Item -Destination "$home\AzCopy\AzCopy.exe"
$userenv = [System.Environment]::GetEnvironmentVariable("Path", "User")
[System.Environment]::SetEnvironmentVariable("PATH", $userenv + ";$home\AzCopy", "User")
```
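One thing to keep in mind: `SetEnvironmentVariable` updates the persisted user PATH, so it only takes effect in shells started afterwards. As a minimal sketch (not part of the list above), you can refresh the current session and confirm AzCopy resolves like this:

```powershell
# Rebuild this session's PATH from the persisted Machine and User values
$env:Path = [System.Environment]::GetEnvironmentVariable("Path", "Machine") + ";" +
            [System.Environment]::GetEnvironmentVariable("Path", "User")

# Quick check that the AzCopy binary is now on the PATH
azcopy --version
```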
# Write Data from PowerShell into Azure Table Storage

I worked on a project recently that wrote data from PowerShell into a CSV file. The goal was to do real-time trending based on the output, but I ran into an issue with file locks as PowerShell and the other program competed for access to the CSV. That's when I got the idea to write to Azure Table Storage instead of to a CSV. The project didn't work out for other reasons, but I did work out how to write data from PowerShell into Azure Table Storage.

The key to writing data to Table Storage is that it requires a primary key (pun intended). This key consists of a timestamp (added on the server side), a partition key, and a row key. The partition key is used for load balancing, as partitions from the same table can be spread across different resources. The row key is a unique value added to the row. The data I'm writing doesn't contain a unique value, so instead I simply generate a new GUID and use that for the row key.

With that, let's configure the prerequisites. First, you will need an Azure subscription and a storage account. Create a new table in the storage account; that is where the data will be written. You will also need Azure Storage Explorer, which will be used to view the data written to the table.

Azure Storage Explorer requires an Access Key from the storage account. Keep this secure; it is essentially a full-control password for the storage account. Use this key and the storage account name to log into Azure Storage Explorer.

Next, generate a new Shared Access Signature (SAS) for the storage account. The SAS can be scoped to the type of storage, access rights, and source IP, and it can have an expiration date. At a minimum, the key will need to be scoped for Table storage with write access.

Now that the prerequisites are set, let's move on to the script. The full version of the script can be found at my GitHub site here.

Start by defining the variables for your environment. This includes the data gathered previously from the storage account. The partition key is a requirement for Azure Table Storage; it allows for load balancing, as partitions can be split between resources. I also defined the array that will be used for the data written into Table Storage. The values below are placeholders for your environment.

```powershell
# Step 1, Define variables for the environment (placeholder values)
$storageAccountName = "<storage account name>"
$sasToken = "<SAS token>"
$tableName = "<table name>"
$partitionKey = "<partition key>"

# Array that will hold the data written into Table Storage
$processes = @()
```

Next, create the connection context and connection string. This is what PowerShell will use to log in and update Table Storage.

```powershell
# Step 2, Connect to Azure Table Storage
$storageCtx = New-AzureStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
$table = Get-AzureStorageTable -Name $tableName -Context $storageCtx
```

In this step I gather the data that will be written to the table. I used a simple Get-Process command that returns the top 10 processes by CPU usage. The output from the command is added to the array.

```powershell
# Step 3, Gather the data and add it to the array
$processes += Get-Process | Sort-Object CPU -Descending | Select-Object -First 10
```

The last step loops through the array, writing data into the table.
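As a minimal sketch of that loop, assuming the community AzureRmStorageTable module is installed (`Install-Module AzureRmStorageTable`) so that its `Add-StorageTableRow` cmdlet is available; the `ProcessName` and `CPU` property names are illustrative choices matching the `Get-Process` data gathered above:

```powershell
# Loop through the array and write each process as a row in the table.
# Assumes the AzureRmStorageTable module; property names are illustrative.
foreach ($process in $processes) {
    Add-StorageTableRow -table $table `
        -partitionKey $partitionKey `
        -rowKey ([guid]::NewGuid().ToString()) `
        -property @{
            "ProcessName" = $process.ProcessName
            "CPU"         = [double]$process.CPU
        } | Out-Null
}
```

Each row gets a fresh GUID for its row key, matching the approach described earlier. Once the loop completes, open the table in Azure Storage Explorer to confirm the rows were written.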