Posts

Showing posts from May, 2022

Blogger and Code Highlighting

Just a few quick notes on achieving nice code highlighting for sharing. My preferred method is to use Visual Studio Code. If you have a language extension installed, VS Code will copy with rich HTML formatting that looks sharp and consistent, though it doesn't quite match the original. Here's a line of SQL: SELECT CONVERT(DATETIME2(3), GETDATE()); And here's how it looks in my VS Code. I do wish that the layered bracket highlighting would transfer over, but honestly that's a very small difference, and I give VS Code high marks for quality of export. Not long ago, PowerShell or SQL code highlighting was not this easy to achieve, since the most readily available editors didn't provide it natively. My secondary method for code highlighting is http://hilite.me/ with the Monokai style. For PowerShell code I made a manual correction to adjust the hard-to-read formatting applied to double quotes, semi-colons, e...

PowerShell OpenFileDialog

This is my preferred version of a function for calling a Windows GUI OpenFileDialog. Adding a few GUI elements can be the difference between a script for your own use and a sharable tool that people with less experience (or a fear of the CLI) can use. This function invokes the standard .NET System.Windows.Forms tools within a PowerShell context.

Function Get-FilesFromDialog {
    Param (
        [Parameter(Mandatory=$False)][String]$myInitialDirectory = $env:USERPROFILE,
        [Parameter(Mandatory=$False)][String]$myTitle = "Open File",
        [Parameter(Mandatory=$False)][String]$myFilterString = ""
    )
    [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
    $OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
    $OpenFileDialog.InitialDirectory = $myInitialDirectory
    $OpenFileDialog.Filter = "$myFilterString" + "All files (*.*)...

Tome files - digital equivalent of the trusty notebook

I've been keeping Tomes for a long time now. They are the trusty text files representing my collective experience: the tricks, techniques, and syntaxes for various systems and languages. The name came from a bit of advice in an old article in 2600, something along the lines of "We may be reaching the point where a notebook and pen can no longer keep up, but technology now allows us to have an always-available text file equivalent. Digital Tomes." The name stuck. It calls to mind old handwritten leather-bound volumes belonging to scholars, 18th-century scientists, philosophers, and other madmen. It evokes a bit of science, and fiction, and RPG. I keep mine as simple text files, but named with a .tome extension, which makes them easy to spot. I tend to have my most important ones constantly open as tabs in Notepad++ and reference them and update them frequently. There's a bit of a style convention to them, using dashes to represent nesting and idea breaks. There are se...

LINQ within a foreach()

An example of how LINQ where clauses save steps. The where can occur right within the declaration of the foreach.

Lambda syntax:
foreach (var person in people.Where(n => n.sex == "male")) { ... }

Query syntax:
foreach (var person in from person in people where person.sex == "male" select person) { ... }

Both syntaxes compile to the same code. (Credit to @YuvalItzchakov https://stackoverflow.com/a/25412168 )
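For comparison (not from the original post), the same filter-in-the-loop-header idea can be sketched in Python with a generator expression; the Person records and field names here are made up for illustration:

```python
# Filtering inside the loop declaration, analogous to LINQ's
# people.Where(n => n.sex == "male") -- sample data is hypothetical.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    sex: str

people = [
    Person("Alice", "female"),
    Person("Bob", "male"),
    Person("Carl", "male"),
]

# The filter lives in the loop header, so no separate
# pre-filtered list needs to be built first.
males = []
for person in (p for p in people if p.sex == "male"):
    males.append(person.name)
```

As with the two LINQ forms, the filtered iteration and an explicit pre-filtering step produce the same result; the inline form just saves a declaration.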

Almost Boring - 1st week reflections

I hit my goal of posting 6 times in the first week. So far, I like it, which of course will only be part of what determines whether or not I stick with it. The Blogger platform is easy to use, seems like a decent place to capture ideas I want to remember, and should be good for when I actually have things that I want to share. The conversational style is a nice way to engage in reflection, bringing together separate threads and trying to turn them into a coherent idea. And ultimately, it feels good to write again. That's something I haven't made time for in a long while. It's not the feeling that I used to get, but in the frustrated and exhausted wreckage of adulthood, with the ashes of inspiration long grown cold... it's as much as I can expect.

Best Practices or Expired Ideas?

Just some thoughts on an interesting Twitter thread:

"for most of what we call 'best practices', we have little to no evidence they are even effective, much less the 'best' way to do something. We call them 'best practices' to get people off our backs and leave us alone." - Adrian Sanabria

Confrontationally and controversially said. (So of course, I like him instantly.) I think a lot of people will miss his main point, though. We can't use "best practices" as a magical excuse for not thinking about what a situation requires. I specifically remember an old boss who would pull the phrase out to shut down conversation on subjects he was not willing to learn the complexities of. It was simply "we've always done it this way" with a fresh whitewash layer of "appeal to authority". @sawaba continues with a call to challenge ourselves: maybe best practices should have some kind of expiration date after which,...

Orchestrator Runbooks and PowerShell literals

This specific strategy hasn't had much field testing yet, so this is a proof-of-concept overview. Our admin who runs System Center Orchestrator realized a logical flaw in a lot of Orchestrator PowerShell examples and templates. Most examples show something like this:

# Set script parameters from runbook data bus and Orchestrator global variables
# Define any inputs here and then add to the $argsArray and script block parameters below
$DataBusInput1 = "{Parameter 1 from Initialize Data}"
$DataBusInput2 = "{Global Variable 1}"

Taken from PowerShell & System Center Orchestrator - Best Practice Template - TechNet Articles - United States (English) - TechNet Wiki (microsoft.com)

When Orchestrator runs that script, it performs inline concatenation of its variables into the PowerShell scri...
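To see why that pattern is fragile, here is a simplified model of the substitution (a Python sketch, not Orchestrator's actual engine): the data-bus value is pasted into the script source as literal text, so a value containing a double quote breaks the generated string literal.

```python
# Simplified model of Orchestrator-style inline substitution:
# the engine replaces the {placeholder} in the script *source text*
# before the script runs, so quoting characters in the value can
# break the generated code. This is an illustrative sketch only.
template = '$DataBusInput1 = "{Parameter 1 from Initialize Data}"'

def substitute(template: str, value: str) -> str:
    # Naive literal replacement, mimicking the data-bus expansion.
    return template.replace("{Parameter 1 from Initialize Data}", value)

safe = substitute(template, "hello")
# Produces: $DataBusInput1 = "hello"        (valid assignment)

broken = substitute(template, 'say "hi"')
# Produces: $DataBusInput1 = "say "hi""     (quotes now unbalanced)
```

The value never passes through PowerShell's parser as data; it becomes part of the program text, which is exactly the flaw the admin noticed.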

Rules of Automation

Automation has several concrete rules:

- Automation requires a system.
  - You can't automate an action; you can automate a process.
- The process has to be well defined.
  - All actions have to be expressible with valid conditional logic ("If this, ... and this, ... but not this").
  - Ultimately this means you can't use the word "sometimes" in the process description.
  - All needed condition values must be in a ready and readable state before the trigger is given for an automation step.
- Automation projects fail because of weak processes, not because of weak technology.
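The rules above can be sketched in code. This is a hypothetical example (the step names and condition values are invented, not from the post): a process step expressed as explicit conditional logic, plus a readiness check ensuring every condition value is readable before the step triggers.

```python
# Hypothetical process step expressed as valid conditional logic:
# "If this, ... and this, ... but not this" -- no "sometimes" allowed.
def should_release_batch(qa_passed: bool, label_printed: bool, on_hold: bool) -> bool:
    return qa_passed and label_printed and not on_hold

def ready_to_evaluate(values: dict) -> bool:
    # All needed condition values must be present (ready and readable)
    # before the automation step is triggered.
    required = {"qa_passed", "label_printed", "on_hold"}
    return required.issubset(values)

inputs = {"qa_passed": True, "label_printed": True, "on_hold": False}
release = ready_to_evaluate(inputs) and should_release_batch(**inputs)
```

If any condition can't be written this way, the process definition is too weak to automate yet, which is the point of the rules.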

Implementing Least Privilege - Visual Studio & Git

The process of actually implementing a "least privilege" strategy usually has a few surprises along the way. I'm not going to list out all of our choices, but I am going to note a few things that caught some of the team by surprise.

- Permissions and restrictions are likely to be different on computers and accounts within the same department.

Not everyone has years of support and administration in their history, so it is worth saying that any time a second user account is involved, it is vital to validate permissions. If there is a resource that you are sharing with your other account, or that multiple people need to work on, you have to be able to see explicitly defined access to have any realistic chance of success with the rest of your strategy. This ultimately is the root cause behind several of the observations below.

- Keep your working folders and local repos inside of your user profile structure. It's structured in a way that is consistent, predictable, and ...

Guiding Principles: Quotes from Andrea Goulet on 'Legacy Code'

"Legacy Code is code that doesn't have communication artifacts that let you discern the rationale or intention behind it." "It becomes Legacy Code once it has a long feedback loop, meaning as soon as the documentation is not readily available." {The relevant application of this principle is that it is not sufficient to make it obvious what the code is doing; for it to be clean and effective code, it should also be discernible why it's doing it.} Regarding comments and documentation: "What am I going to need in 6 months to be able to context switch back to what I understand right now? Whatever is needed has to be provided now." Regarding dependencies: "Chaos theory teaches us that you can't build a repeatable process around Legacy Code. The more dependencies that you have in a system, the less likely you are to be able to accurately estimate the outcome." Harvested from: Andrea Goulet, Hanselminutes Technology Podcast - Fresh Air and...

The benefit of excluding Inactive objects in an application.

Should a UI allow or block the use of an object that has been disabled? We recently had to make that decision in our MES project and chose a consistent policy of excluding inactive objects from all the end-user pages. Only the administrative tools and forms designed to edit (and re-enable) those objects will allow them to be interacted with. Your decision could go either way, but here's why we made that specific choice: Any good application is going to last for a while. (Based on experience, bad applications last even longer, because no one wants to get involved in rewriting something that couldn't be done right the first time.) With a long-lasting application that has customizable "objects", you're going to eventually wind up with some that have legacy properties and mystery values. In an MES these will be things like Work Cells, Equipment, Materials, Routes, etc. You know which ones you don't want to touch. You've given the users new maintenance utilities ...
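The policy can be sketched in a few lines. This is an illustrative Python sketch, not the MES code; the WorkCell records and the "active" field name are invented for the example:

```python
# Hypothetical sketch of the policy: end-user lookups exclude inactive
# objects, while administrative tools still see everything so objects
# can be edited and re-enabled.
work_cells = [
    {"name": "Cell-A", "active": True},
    {"name": "Cell-B", "active": False},  # legacy object, mystery values
]

def user_facing_cells(cells):
    # One consistent rule on every end-user page:
    # inactive objects never appear as choices.
    return [c for c in cells if c["active"]]

def admin_cells(cells):
    # Admin/maintenance forms list everything.
    return list(cells)
```

Centralizing the filter in one helper (rather than per-page checks) is what makes the policy consistent across the application.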

Designing for application security is designing for data validation

We probably don't talk enough about how good design pays off on multiple targets. For example: security is often something that gets begrudging compliance, but well-designed security steps often intersect with good data validation. The most talked-about reason for input parameterization and protecting against SQL injection is "so that a malicious user doesn't pull a Bobby Tables on you." The most legitimate real-world reason for input parameterization and anti-injection code is because "if no one can break it on purpose, then it's not going to break by accident because the AS/400 forwards you values with random semi-colons in them." The second most legitimate real-world reason is that poorly implemented dynamic SQL interferes with the predictability of your queries, so SQL Server is likely going to give you better execution plans with effective parameters. Chances are that implementing any good design feature will improve more things than the advertise...
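A minimal sketch of the parameterization point, using Python's sqlite3 as a stand-in for SQL Server (the table and the AS/400-style value are invented for illustration): a value containing a stray semi-colon is bound as data, so it can't terminate or alter the statement.

```python
# Parameterization sketch using sqlite3 (stand-in for SQL Server):
# a value with a stray semi-colon -- hostile or just accidental --
# is treated as data, not as part of the SQL statement.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (code TEXT)")

risky_value = "ABC;DROP TABLE parts"  # malicious, or just forwarded as-is

# The ? placeholder makes the driver bind the value separately
# from the SQL text, so the semi-colon never reaches the parser.
conn.execute("INSERT INTO parts (code) VALUES (?)", (risky_value,))

row = conn.execute(
    "SELECT code FROM parts WHERE code = ?", (risky_value,)
).fetchone()
```

The same habit that defeats Bobby Tables also keeps the query text stable, which is what lets the engine reuse a good execution plan.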