
I wanted to try Microsoft Flow for the first time, so I looked for a use case and settled on something that has been bothering me for a while. When I have an epic moment while gaming on my Xbox, I can capture it with a quick double tap on the Xbox button followed by the ‘X’ button.

The next step is to get it into the cloud, because auto-upload is still not a thing yet (please upvote).

So when the recording is done, you get a notification, and you hold the Xbox button to put it up on OneDrive.

So it’s in the cloud. But it’s private… I wanted it to be available on YouTube.

Auto-Publish Xbox recording on YouTube

This is my use case! I have seen several options (how ironic) on YouTube about using the YouTube app for Xbox. But I want to auto-upload the recordings and have them set to private, so that I only occasionally have to look into my YT account to crop, caption and publish them. It is the easiest option, because the files are already at the right destination.

YouTube is limited in Flow

The YouTube connector/triggers/actions are limited… so I need to pull in something external to get it to work. I hope they will add more capabilities for Flow and YT in the future.


There are no actions, so you can’t upload from Flow to YouTube (yet); I have added it as an idea for Flow. Since I decided that using Flow was a requirement (to get to know the product), I made this flow:

image

Sorry for the Dutch, but it says: when a file is created, create a file on Dropbox.

I specified the video/xbox directory on OneDrive as the source and created a new directory on Dropbox as the target.

The final step for automation to YouTube

So Flow lacks some options… I had an old but still active IFTTT (if this, then that) account, but that wouldn’t work either. So I searched some more and found Zapier. It turns out I could also have left out Flow and gone straight from OneDrive to YouTube there, but since I wanted to use Flow, I kept the Dropbox step. So I configured Zapier to auto-upload every new Dropbox file, and it’s done.

Zapier

I found out that they are hiring, but they use a totally different stack than what I prefer, so perhaps it’s something for you, dear reader. They have some great docs about creating a Zapier app with the Node.js SDK. But as I said, I prefer a different stack, so here is a C# approach with webhooks for Zapier. They should/could add (unsupported) C# docs to their official site.
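Whatever the stack, the core of a Zapier webhook integration is just an HTTP POST of JSON to the “Catch Hook” URL that Zapier generates for your Zap. A minimal sketch (shown in JavaScript here for brevity; the hook URL and the payload field names are placeholder assumptions, not real endpoints):

```javascript
// Sketch: trigger a Zapier Zap by POSTing JSON to a "Catch Hook" URL.
// The hook URL below is a placeholder; Zapier generates the real one per Zap,
// and the payload field names are made up for this example.
const ZAPIER_HOOK_URL = "https://hooks.zapier.com/hooks/catch/123456/abcdef/";

function buildRecordingPayload(fileName, shareUrl) {
  // Zapier accepts any JSON body and exposes its fields in later Zap steps.
  return JSON.stringify({
    fileName,
    shareUrl,
    uploadedAt: new Date().toISOString(),
  });
}

async function notifyZapier(fileName, shareUrl) {
  const res = await fetch(ZAPIER_HOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildRecordingPayload(fileName, shareUrl),
  });
  return res.ok; // Zapier responds with a 2xx status when the hook is caught
}
```

Zapier then picks up the caught payload as the trigger for the rest of the Zap, so any language that can do an HTTP POST works.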


So I finally have a semi-automatic upload to YouTube. A quick tap-tap on the Xbox button on the controller and a press of ‘X’ records the last 30 seconds. I send it to OneDrive, and from there it goes automatically to Dropbox and on to YouTube as a private video, ready for captioning and publishing. Maybe someday when I have some spare time I’ll automate it even more, but for now: good luck and take care!


It’s almost the 25th of May, so it’s a bit of a rush, but I am looking into the required changes to the Google Analytics JavaScript code to make it GDPR compliant.

As you probably know by now, GDPR stands for General Data Protection Regulation, which should give European website visitors more protection and will probably benefit the privacy of others too.

Google has sent an e-mail saying that you have to accept a data processing amendment, which is the first step for GDPR. The second step requires a change in the source code of your website! I think this is not emphasized enough online, which is why I wrote this blog post.

IP Anonymization in Analytics

The law requires you to make sure that the last octet of the visitor’s IPv4 address (or the last part of an IPv6 address) is not kept by Google. Or, more precisely, to have Google ignore it.


image

source: Autoriteit Persoonsgegevens (the Dutch data protection authority) – guide to privacy-friendly configuration of Google Analytics, March 2018

However, the official Google documentation says that you just have to add a query parameter to the URL of the .js file:

https://support.google.com/analytics/answer/2763052?hl=en 

The aip parameter should be set to one (true):

&aip=1

But it’s also possible that you have some legacy JavaScript and have to do it with this line:

_gaq.push(['_gat._anonymizeIp']);

So look into your current Analytics JavaScript code and make the change. The PDF from the Dutch data protection authority also says you should take a screenshot of the code, with the date and time the change went live, as proof that you made the change before the law came into force. That is not solid evidence, so I do not understand why they suggest it.

https://www.google-analytics.com/analytics.js?aip=1 
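If you build that URL somewhere in code, a small helper can make sure the parameter is always present. This is purely an illustrative snippet; the function is hypothetical, not a Google API:

```javascript
// Sketch: force aip=1 onto a Google Analytics URL so the visitor's IP
// is anonymized. Hypothetical helper, just standard URL manipulation.
function withAnonymizedIp(scriptUrl) {
  const url = new URL(scriptUrl);
  url.searchParams.set("aip", "1"); // idempotent: overwrites any existing value
  return url.toString();
}

console.log(withAnonymizedIp("https://www.google-analytics.com/analytics.js"));
// → https://www.google-analytics.com/analytics.js?aip=1
```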

Good luck getting everything in order for the upcoming GDPR!


So I installed WSL as soon as it was available in the production ring of Windows 10. It is nice, epic, etc.; you should try it yourself. I used it to format a large XML file fast so that I could actually read it. That was a year ago, so it has been around for a while now.

Scott tweeted that you can pipe ('|') output between Windows/PowerShell and Linux commands! But in the screenshot there was this "wslconfig /list" command. For me it listed “Legacy (Default)”…

Every time a software developer says “legacy”, they should wash their mouth out. So I had to remove it!

Scott pointed me to this article about updating WSL. But I did not want to update it, because I already have the latest from the Store. That article is also a year old, by the way.

So the command you are looking for, in case you have a legacy WSL as default, is “lxrun /uninstall /full /y”.

Today (19th of April 2018) there are several options in the store:

Debian GNU/Linux https://www.microsoft.com/store/productId/9MSVKQC78PK6
Ubuntu https://www.microsoft.com/store/productId/9NBLGGH4MSV6
openSUSE Leap 42 https://www.microsoft.com/store/productId/9NJVJTS82TJX
SUSE Linux Enterprise Server 12 https://www.microsoft.com/store/productId/9P32MWBH6CNS
Kali Linux https://www.microsoft.com/store/productId/9P32MWBH6CNS

Try some and enjoy!

Good luck!


Managed disks have a lot of benefits and are the preferred way to create a new VM. However, because of my tight budget, I wanted to move to unmanaged disks. I did not expect the costs to continue while the VM was stopped; that turned out to be because of the managed storage disk(s).

I found this SO answer by Jason Ye – MSFT.

$sas = Grant-AzureRmDiskAccess -ResourceGroupName "[ResourceGroupName]" -DiskName "[ManagedDiskName]" -DurationInSecond 3600 -Access Read
$destContext = New-AzureStorageContext -StorageAccountName "[StorageAccountName]" -StorageAccountKey "[StorageAccountAccessKey]"
$blobcopy = Start-AzureStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContainer "[ContainerName]" -DestContext $destContext -DestBlob "[NameOfVhdFileToBeCreated].vhd"

I had to create containers in the storage account first, but with that done I copied both the OS disk and the data disk to blob storage as .vhd files.

I used this PowerShell script to create the VHDs in blob storage, and a template to build the VM from them. The data disk can be added later in the web GUI.

  1. Login-AzureRmAccount
  2. Get-AzureRmSubscription
  3. Set-AzureRmContext -SubscriptionName "my subscription name here"
  4. $sas = Grant-AzureRmDiskAccess -ResourceGroupName "resourcegroup" -DiskName "manageddiskname" -DurationInSecond 45000 -Access Read
     $destContext = New-AzureStorageContext -StorageAccountName "storageaccount" -StorageAccountKey "myprivatekey"
     $blobcopy = Start-AzureStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContainer "vhd-containers" -DestContext $destContext -DestBlob "givetheunmanageddiskaname.vhd"
  5. Get-AzureStorageBlobCopyState -Container "vhd-containers" -Blob "givetheunmanageddiskaname.vhd" -Context $destContext -WaitForComplete

My mistake was using the value 3600 for ‘DurationInSecond’, which is just one hour (60 seconds × 60 minutes). The 512 GB data disk could not be copied to blob storage within an hour (or even two). I only found out that an hour was insufficient when I discovered ‘Get-AzureStorageBlobCopyState’.
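A back-of-the-envelope estimate would have caught this. As a rough sketch, assuming an illustrative copy throughput of 60 MB/s (an assumption, not an Azure guarantee) and doubling the result as a safety margin:

```javascript
// Rough estimate of how long a blob copy of a disk might take, used to pick
// a sensible -DurationInSecond for the SAS. The 60 MB/s throughput figure
// is an assumed, conservative number; real throughput varies widely.
function estimateSasDurationSeconds(diskGb, mbPerSecond = 60, safetyFactor = 2) {
  const copySeconds = (diskGb * 1024) / mbPerSecond; // size in MB over throughput
  return Math.ceil(copySeconds * safetyFactor);
}

console.log(estimateSasDurationSeconds(512));
// → 17477, i.e. roughly 4.9 hours; far more than the 3600 seconds I granted
```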

Because I already had a vnet from my vm with managed disks, I used this template to create a new vm with the os disk from blob storage: https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-specialized-vhd-existing-vnet

If you do not have a vnet yet, you should use https://github.com/Azure/azure-quickstart-templates/tree/master/101-vm-from-user-image. The ‘Deploy to Azure’ button is a useful tool!

image

Once you have a new VM with an unmanaged disk up and running, stop it to add the data disk. Once you have done that and have a remote desktop connection, go to Disk Management and bring the data disk online again. It took me some time to get my head around the ASM and ARM differences in the PowerShell tooling, also because there is now Azure CLI and a cross-platform PowerShell 6.0.

The cloud is moving fast, so hop on. Don’t miss out!


Good luck!


I followed the guide but did not get it to work.

It just failed at step 3 and there was no error at all. I thought it had to do with me changing the project from .NET Core 1.x to 2.0.

image

But it was partially related to the dropdown below it.

The startup object was not set.

I had to set it to “Neo.Compiler.Program”

But after that, the publish still did not work because “neon.dll” was missing from some folder.

This GitHub comment pointed me in the right direction:

https://github.com/neo-project/docs/issues/368#issuecomment-362181887

You should copy “neon.dll” into that directory manually; it’s just one directory below.

After that, the publish succeeded. The neon.exe reference was already in the PATH variable, so both the Visual Studio option and the manual command line option worked!

image

https://github.com/neo-project/neo-compiler/issues/90

Good luck building for the Neo blockchain!
