All columns in a database

This is just an aide-mémoire for a query I use seemingly every week as part of my data integration work. It's often easier to work through an Excel spreadsheet when you are doing data mapping than a SQL tool, as you can search for text and the like to help find a field.

This is just the SQL for doing that on the major SQL databases you are likely to come across.

Microsoft SQL

select schema_name(tab.schema_id) as schema_name,
tab.name as table_name,
col.column_id,
col.name as column_name,
t.name as data_type,
col.max_length,
col.precision
from sys.tables as tab
inner join sys.columns as col
on tab.object_id = col.object_id
left join sys.types as t
on col.user_type_id = t.user_type_id
order by schema_name,
table_name,
column_id;

Oracle

SELECT * FROM all_tab_cols ORDER BY table_name, column_id;

DB2

select c.tabschema as schema_name,
       c.tabname as table_name,
       c.colname as column_name,
       c.colno as position,
       c.typename as data_type,
       c.length,
       c.scale,
       c.remarks as description,
       case when c.nulls = 'Y' then 1 else 0 end as nullable,
       default as default_value,
       case when c.identity = 'Y' then 1 else 0 end as is_identity,
       case when c.generated = '' then 0 else 1 end as is_computed,
       c.text as computed_formula
from syscat.columns c
inner join syscat.tables t
        on t.tabschema = c.tabschema
       and t.tabname = c.tabname
where t.type = 'T'
order by schema_name, table_name;

MySQL

select table_name, column_name
from information_schema.columns
where table_schema = 'your_db'
order by table_name, ordinal_position;
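SQLite isn't one of the "major" servers above, but for completeness, the same "dump every column to a spreadsheet" trick can be sketched in Python against SQLite (an in-memory database here as a stand-in; the table names and CSV path are just examples):

```python
import csv
import sqlite3

# In-memory SQLite database as a stand-in for a real server
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, email TEXT);
    CREATE TABLE invoice  (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
""")

rows = []
# sqlite_master lists every table; PRAGMA table_info lists each table's columns
for (table,) in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"):
    for cid, name, col_type, *_ in conn.execute(f"PRAGMA table_info({table})"):
        rows.append((table, cid, name, col_type))

# Dump to CSV so the mapping work can happen in Excel
with open("all_columns.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["table_name", "column_id", "column_name", "data_type"])
    writer.writerows(rows)
```

Open the resulting CSV in Excel and you get the same searchable field list as the queries above.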

Dyslexia and Blog posting

I have dyslexia, and while it is often considered a liability, I have never found it so. In fact, it makes my life easier: you're better at maths, you're better at logical puzzles, and so on.

However, it does still bite me a little when I'm writing blog posts: I simply don't see words that are just a little wrong, and I don't see where sentences don't make any sense. The traditional way of getting round this is to copy and paste the text you have just written into a new format and then reread it, but that takes time and often doesn't work.

Enter the latest in the internet age: half the time now I simply dictate my blog posts and then go back and review them. Gone are the days of dictation software taking ages to learn your voice and making more mistakes than it is worth; this post itself was done via a phone in a couple of minutes. Even with modern technology, though, mistakes are made, and things simply don't scan when I read them back. There is an easy fix: purely by accident I bumped into Amazon Polly. I just thought it was a cool little WordPress plugin that converted all of your blog posts to speech, but now I'm its main user; whenever I write a blog post, before publishing it I simply preview it and get Amazon Polly to read it back. It's amazing how much that clears up bad words, bad phrases and non-existent punctuation.

Give it a try and see what I mean 

 

The best USB-C hub

As just about all computers now use USB-C as their main port, most of the historic laptop docks have been replaced by generic USB-C hubs; in fact a proper dock is rare to see, and the majority of these hubs turn out, in my opinion, to be rubbish.

I have tried loads of them and they all have their own failings, be it bad pass-through on the power or poor organisation of ports; some are even missing commonly used ports. The worst offence to me, though, is that nearly all of them insist on a hardwired USB-C cable to attach them to your machine, often only 6 inches long, meaning you can never place the hub in a nice orderly place for all the cables that need to attach to it.

An otherwise good USB-C hub ruined.

Thankfully I found one that gets over all of these issues: say hello to the StayGo USB-C Hub. It looks just like the normal run-of-the-mill dock, but this one allows you to use your own USB-C cable, even though it also contains its own built-in short travel cable.

It's made of metal, so it has a little more weight to it, and it has nice rubber feet to stop it slipping. This, combined with a nice layout and the full complement of needed ports, means it is now my go-to hub. Why don't more people design stuff like this?

 

 

Noddy Encode SQL

Sometimes you want to scramble or anonymise data in a table, mainly so you can reuse 'real' data without having real names or details showing. You don't want to screw up the formatting, so things like emails and phone numbers shouldn't break, and you only want to do it to certain fields.

Yes, I know this is a naff way to do it, but given the sheer number of DBAs that do it this way I figured it was worth posting. Also note that you shouldn't just use this as it is; move some of the replacement values around to give it your own unique mapping.

UPDATE **table_name** t
SET t.**field_you_want_to_update** =
REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(
REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(
REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(
t.**field_you_want_to_update**
, 'a', 'g'), 'b', 'w'), 'c', 'c'), 'd', 'x'), 'e', 'f'), 'f', 'v'), 'g', 'b'), 'h', 'y'), 'i', 'm'), 'j', 'u'), 'k', 'e'), 'l', 'p'), 'm', 'l'), 'n', 'a'), 'o', 't'), 'p', 'z')
, 'q', 'd'), 'r', 'h'), 's', 'k'), 't', 'n'), 'u', 's'), 'v', 'o'), 'w', 'i'), 'x', 'q'), 'y', 'r'), 'z', 'j'), 'A', 'G'), 'B', 'I'), 'C', 'H'), 'D', 'M'), 'E', 'A'), 'F', 'Z')
, 'G', 'Y'), 'H', 'L'), 'I', 'B'), 'J', 'K'), 'K', 'J'), 'L', 'N'), 'M', 'C'), 'N', 'T'), 'O', 'R'), 'P', 'S'), 'Q', 'X'), 'R', 'D'), 'S', 'W'), 'T', 'E'), 'U', 'F'), 'V', 'F')
, 'W', 'P'), 'X', 'U'), 'Y', 'V'), 'Z', 'Q'), '0', '9'), '1', '3'), '2', '6'), '3', '7'), '4', '1'), '5', '2'), '6', '3'), '7', '0'), '8', '4'), '9', '5');
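For comparison, here is a rough Python sketch of the same idea (a fresh random mapping, not the one above). Unlike the nested REPLACEs, which run one after another and so can re-replace characters produced by an earlier step, str.translate does the whole substitution in a single pass:

```python
import random
import string

# Build a random one-to-one scramble for letters and digits, leaving
# punctuation (the '@' and '.' in emails, spaces, etc.) untouched.
rng = random.Random(42)  # fixed seed so the scramble is repeatable

def scramble_table(chars: str) -> dict:
    shuffled = list(chars)
    rng.shuffle(shuffled)
    return dict(zip(map(ord, chars), shuffled))

table = {}
table.update(scramble_table(string.ascii_lowercase))  # a-z stays lowercase
table.update(scramble_table(string.ascii_uppercase))  # A-Z stays uppercase
table.update(scramble_table(string.digits))           # 0-9 stays a digit

original = "Jane.Doe42@example.com"
scrambled = original.translate(table)
print(scrambled)
```

Because case and digits map within their own groups and punctuation is untouched, the formatting of emails and phone numbers survives the scramble.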

 

FileZilla Download Issue

FileZilla is one of the best FTP clients ever; however, while FileZilla is perfect, everybody else isn't.

What this means is that if an FTP site isn't configured correctly, trying to download files that have special characters or accents in their names using the FileZilla client will fail. The FileZilla team don't want to support a workaround because they're doing the correct thing standards-wise (UTF-8 settings on the server) and it's other people's failings that are causing the problem, but other clients, e.g. Firefox, DO support this kind of behaviour, which is a shame…

This means we need to find a way around the problem within FileZilla. The first sign that something has gone wrong will be in your Failed transfers tab.

A quick glance at the file name usually shows you that there is a special character or accent in it, in this case an “À”.

To get round this, the easiest way I found is simply to select all of the items that have failed, right-click, and export them.



This will give you an XML file. Open it up, look for the “File” sections, then find and replace the special characters in the “LocalFile” section with non-special characters. Save the file, then import it into FileZilla (File → Import), and you will find that it downloads all the missing files.

It would be nice if there were an automated version of this, but as far as I can tell the FileZilla guys are sticking to standards. 🙁

<File>
    <LocalFile>/mnt/Slow/DownLoads/Anime/Series/Ongoing/Kirakira Precure A La Mode/[anon] KiraKira Precure À La Mode - 47 [1280x720].mkv</LocalFile>
    <RemoteFile>[anon] KiraKira Precure À La Mode - 47 [1280x720].mkv</RemoteFile>
    <RemotePath>1 0 5 Anime 6 Series 7 Ongoing 26 Kirakira Precure A La Mode</RemotePath>
    <Download>1</Download>
    <Size>571273852</Size>
    <ErrorCount>1</ErrorCount>
    <DataType>1</DataType>
</File>
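The manual find-and-replace step can be scripted. Here's a rough sketch (not an official FileZilla tool; the file names are just examples) that reads the exported queue XML, strips the accents from every LocalFile entry using Unicode decomposition, and writes a cleaned file ready for File → Import:

```python
import unicodedata
import xml.etree.ElementTree as ET

def strip_accents(text: str) -> str:
    # Decompose accented characters (e.g. 'À' -> 'A' + combining grave),
    # then drop the combining marks, leaving plain letters.
    decomposed = unicodedata.normalize("NFKD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

def clean_export(in_path: str, out_path: str) -> None:
    tree = ET.parse(in_path)
    # Only LocalFile needs fixing; RemoteFile must keep the real name
    for local in tree.iter("LocalFile"):
        local.text = strip_accents(local.text or "")
    tree.write(out_path, encoding="utf-8", xml_declaration=True)
```

Usage would be something like `clean_export("queue_export.xml", "queue_clean.xml")`, then import the cleaned file back into FileZilla.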

Copying NSF files to the cloud

This is an old tip that I never thought I would use again but has come back to life with the advent of the cloud:

Scenario:

We are migrating multiple servers from on-site to the cloud. The bandwidth available means the copy won't happen within 24 hours, and it definitely won't happen within the maintenance window we have. Normally with Lotus Notes migrations from one server to another this wouldn't be an issue, as Domino replication has been a model of stability and ease of use for well over 10 years. HOWEVER, there are tons, and I do mean tons, of complex replication settings in this client's setup, a lot of them unknown or unremembered by the client. They have found that using replication means they will miss some of these, and after having a look at them I tend to agree, so file copy it is.

NSF files tend to be a bit bulky but zip up really rather well, so zipping them up before moving them over makes sense. We don't want to do one large zip, though, because 1) the target file system doesn't have a lot of extra space on it, and 2) the actual copy will take several days, so we want to do it one chunk at a time.

First, let's get a list of all of the NSF files we want to copy over. This has the double advantage of giving us an indication of numbers and size, as well as giving us something we can actually work through so that we can do one group at a time:

dir *.nsf /s /b > f:\filelist.txt

Once we have the list, the following little script, popped in a batch file on a system with WinRAR installed, will give you an exact mimic of your Notes folder structure but with each NSF zipped up and in the correct place:

@ECHO OFF
setlocal enableextensions
for /F "tokens=*" %%A in (f:\filelist.txt) do (
    FOR %%i IN ("%%A") DO (
        md "D:\NSFZIP%%~pi"
        "C:\Program Files (x86)\WinRAR\RAR.exe" a -r -dh "D:\NSFZIP%%~pi%%~ni.rar" "%%A"
    )
)

You can then copy them over by whatever method you prefer and unzip as suits you. This method may seem a little Noddy, but this is the third time I've used it, and every time the Notes movement has been the easiest part of a migration.
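If you would rather avoid WinRAR and batch-file quoting, the same "one archive per file, mirroring the folder tree" idea can be sketched cross-platform in Python (zip instead of rar, and the paths are just examples):

```python
import zipfile
from pathlib import Path

def zip_tree(source_root: Path, target_root: Path, pattern: str = "*.nsf") -> list:
    """Create one .zip per matching file, mirroring the folder structure."""
    created = []
    for src in source_root.rglob(pattern):
        rel = src.relative_to(source_root)
        dest = (target_root / rel).with_suffix(".zip")
        dest.parent.mkdir(parents=True, exist_ok=True)
        with zipfile.ZipFile(dest, "w", zipfile.ZIP_DEFLATED) as zf:
            zf.write(src, arcname=src.name)
        created.append(dest)
    return created
```

Call it as, say, `zip_tree(Path("D:/Notes/data"), Path("D:/NSFZIP"))` and you get the same mirrored tree of small archives, ready to copy one chunk at a time.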

Salesforce Same Code Different Triggers

In Salesforce the same bit of code can be triggered in a lot of different ways, and with calls to third parties there are different rules for each way of calling things.

For example, take this bit of code. In it we are just passing a contact ID, and it is going to go and talk to a third-party web service; inside “setUpRequest” it's going to update the third party with the details of the Salesforce Contact, and in return receive some bits and bobs from the third party to update the Salesforce side. Basic syncing between two parties:

public class BlogFramework {
    public static void UpdateContactFromExternalWebService(String contactID) {
        // setUpRequest (not shown) builds the HTTP request that syncs
        // this Contact with the third-party web service
        Http h = new Http();
        HttpRequest request = setUpRequest(contactID);
        HttpResponse response = h.send(request);
    }
}

 

We want this to happen at two different times:

1. When a user manually updates a contact and then saves it: we want the sync to happen instantly, so the user can see immediately what's happened and what's been updated.

2. On a schedule: the contact might not be updated in Salesforce at all; all changes might happen in the third party, but the details still have to be kept up to date for reports and views etc.
So this bit of code has to be callable both from a schedule and from a save trigger.

Let's take the save trigger first. As it stands it won't work; you will get the error “Callout from triggers are currently not supported.” if you try. Normally you would just pop the annotation “@Future(callout=true)”[1] at the top of this function and that would solve it, but as you will see later on we can't do that. So what we're going to do is have a little wrapper function that has the @Future annotation, and from that it's going to call our real function.

@Future(callout=true)
public static void UpdateContactFromExternalWebServiceTrigger(String contactID) {
    BlogFramework.UpdateContactFromExternalWebService(contactID);
}

 

We can then put that wrapper function in our contact save trigger and everything will work perfectly:

trigger ContactTriggerAllEvents on Contact (
    before insert,
    before update,
    //before delete,
    after insert,
    after update
    //after delete,
    //after undelete
    ) 
    {
        for(Contact cnt:Trigger.new)
        {
            BlogFramework.UpdateContactFromExternalWebServiceTrigger(cnt.ID); 
        }        
    }

 

Next comes calling it from a schedule. If we had put the @Future annotation on the actual function this would fail, because you cannot call a future function from a scheduled action, but we don't have that issue now. What you DO have to do is bolt “Database.AllowsCallouts” onto your batch object, as seen below:

global class UpdateFromAzpiral implements Database.Batchable<sObject>, Database.AllowsCallouts{
    // Get all the contacts
    global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator([SELECT Id FROM Contact]);
    }
    // The execute method is called for each chunk of records returned by start.
    global void execute(Database.BatchableContext BC, List<Contact> scope){
      for(Contact c : scope){
         BlogFramework.UpdateContactFromExternalWebService(c.ID);
      } 
    }
    // The finish method is called once all the batches have been processed.
    global void finish(Database.BatchableContext BC){
    }
}

 

Now your batch object will be allowed to do callouts.
Putting all these bits together means you can have a single function that calls out to third parties and can be triggered from either a schedule or an ordinary trigger.

References

1 The “@Future(callout=true)” annotation basically means that the Salesforce code does not stop and wait before doing other things; this means that calls to third parties do not slow down the Salesforce UI.

Remote Desktop While Away

In a few weeks I pop off on my first holiday for 2 years, and for this trip I really do not want to take my laptop. This would seem odd, as the darn things have been practically glued to me for the last 20 years, but:

  1. Time to learn to take a break
  2. As a member of LDCVia I need to learn when to share the work, and how to hand over.[1]
  3. Errr… well… There are now countries that were once thought to be friendly that have shown a distressing habit of being hostile at their borders, wanting to take your clients' private data away and look at it for “Reasons”, and I figured this would be a great dry run for having to travel to such a country again.

But I'm not insane, nor can I abandon my clients, so I needed to retain access in some way so that I can support as and when it is needed. Some sort of cloud desktop that I can reach from a cheap laptop or tablet seemed like an obvious answer.

Requirements

  1. It must be able to run a form of VMware, as each of my clients has their own separate VM machine(s) with separate security.
  2. It must support Android (large and small screen) and Linux as clients.
  3. Must not cost a bleeding fortune.

Contenders

Amazon Workspaces: this looked a perfect fit to start with, and I use AWS for lots of other services including this blog, but the setup was a right faff, slow and cumbersome, and then I realised that it would not allow VMware or any virtualisation. Undaunted, I thought I would at least check out the performance, only to discover that despite stating it “just connects from anything” it only meant large Android screens (not my phone) and not Linux at all… so in the bin it went (a shame, really).

VMware Remote Console: wince!! I have to own to a bit of paranoia here myself; direct access to the clients' VMs from outside is just too much of a risk, so it stops there. Sorry, VMware. I tried to mentally run through the conversation explaining my actions to clients, and none of them went well.

I started to look at Citrix, then got a grip and thought, “OH COME ON”, it's just one machine. Just get another machine, load your VMs on it, and get a good secure remote client… cue another 30 minutes wasted looking at online hosting and then at a small local server, and more swearing from the office as I realised “JUST USE YOUR BLOODY LAPTOP, YOU TWIT, YOU WON'T HAVE IT WITH YOU”… fsss.

So it just comes down to good remote connection software.

The 3 that stood out were:

LogMeIn: from my point of view, easy to discount as it does not support Linux guests, but it would have been discarded anyway as it has a small-company feel, which again makes it a difficult sell to clients.

TeamViewer: used by many of my clients and supports a lot of nice security features, but a bit on the pricey side.

VNC Connect: I was attracted to this as I use the free version for my Raspberry Pis. I also like the VNC standard; it has been security-hardened by many a grumpy sysadmin and dev over the years, and the VNC Connect platform provided by RealVNC ticks all the boxes.

I was actually hard pressed to pick between TeamViewer and VNC Connect: on paper they both provided all the features, platforms and security that I could want, but in the end VNC Connect won, partly because of price (it is £42[2] a year vs £384) but mainly because the TeamViewer Android viewer does not support the use of a mouse.[3]

Security

OK, let's get the elephant in the room out of the way: how are we handling security?

Well, VNC Connect is nicely paranoid about security, so getting to my laptop now requires two logons, both different 12-digit ones (one to log in to VNC Connect and one to get to my laptop); then each VM is encrypted and requests a login appropriate to the OS used; then of course there are the normal passwords for each client to connect via VPN/programs etc.

I ran that through the potential questions from any of my clients and it came out OK.

Physical Setup

I have attempted to do this kind of thing before and have obviously used remote software all my career, and thus have sworn at my fair share of thin clients, phones and tablets.

Requirements

  1. Lighter than just taking my laptop
  2. Resolution of a decent Screen
  3. Good keyboard
  4. Good mouse
  5. Either not a silly price or very reusable.

I looked at Chromebooks, but frankly all in my price range were poo; Microsoft Surface laptops were too expensive, and I'm not in the iOS ecosystem. So an Android tablet it is, and that meant a Nexus or Pixel, as few of the other vendors keep their security patches up to date. Thankfully the Pixel C was on special offer (most likely due to being replaced in the next month or so), which meant I could get a good tablet with a great screen and a very pleasant keyboard; paired with the new Logitech trackball, I had something that was very usable, thank you.



Testing

So I did 2 basic tests, performance and usability

Performance: I connected my laptop to a VPN in Japan, then tethered the tablet to my phone, then connected via the remote software and sat down to work. It was totally usable; there was that slight lag on the mouse you get with any remoting software, but no more than I get when I VPN into any client's network, and that was going to be the real test: using a VM via another VPN while remoting into the host PC. Here I have to say I cheated like a devil, as my laptop is in the comms room of the office, where it has access to the AWESOME leased line we use (and is physically secure), so in fact it was actually faster and more responsive than normal, which was a more than pleasant surprise.

Usability: not bad; not perfect, but not bad. It worked as well as any remote program, with a couple of extra quirks. It does not resize the client desktop, as it is a genuine KVM rather than creating a new session, which is both good and bad and easy to work around. The other quirk I'm still working on is that the top and bottom bars of the RealVNC client trigger very easily and don't always retract cleanly without an extra click. I would like to be able to allocate a special gesture to the client to stop it working in modal form, the same as the VMware client, perhaps a three-finger swipe from the top or something like that.

Conclusion

OK, I have been doing dry runs of leaving my laptop in the office, just taking the tablet and mouse home at night and working on that, and I have to say I am now comfortable going to Japan without the laptop. This setup works…. wish me luck.. :p

References

1 I get told off about this about once a month.
2 They do do a cheaper version, but I wanted the higher-level encryption.
3 But I do have to give the TeamViewer client credit for handling screen rendering better than VNC; it would have been nice to get the best of both worlds, and I hope VNC improves on that front.

New Logitech TrackBall

I moved to a trackball from a mouse about 22 years ago when I got a twinge of RSI in my shoulder, and the best one on the market at the time was the Logitech PS/2 Marble TrackMan. Since then I have upgraded with each new model, resulting in the current M570. A week ago, on the off chance and needing a Bluetooth trackball, I checked the Logitech site and, squeaking with surprise, saw the new MX Ergo trackball was just out.

A few days later it arrived in my sweaty hands, and I discovered that someone at Logitech seems to be a mind reader and has added just about everything I wanted. So this review is going to be a bit unbalanced, but I will try to point out the good and the bad.

Good

  • Lots more grip: the M570 had 4 little rubber feet that had a habit of coming off and disappearing into the ether, meaning the mouse would rock slightly until you replaced them. The MX Ergo has a solid rubber foot that all but glues it to any surface with NO sliding.
  • Multi-multi connection: the MX Ergo uses both Bluetooth and the Logitech Unifying Receiver, and supports 2 devices swappable with a button click. This means I can swap between my laptop and tablet very easily, with no messing around to re-pair. It is JUST what I wanted and works flawlessly.
  • Easy precision switch: there is another little button that swaps between fast and high precision mode, another feature I have wanted and expect to come in useful.

Bad

  • The little devil is more than twice as heavy as the M570, which is both good and bad: good in that it feels far more solid to use, bad in that I carry the sod around all the time.
  • Fragile: now, I don't think it actually IS any more fragile, but as it feels more complicated, I think it could get grubby pretty quickly, and the base plate could fall off in my bag.
  • Expensive: an extra £30 or so over the M570, plus £20 for the case I felt I needed, makes this a pricey old mouse.




Size and feel are nearly identical to the M570 (which just shows they got it right the first time), with the addition of the 2-position rocker (hand flat and hand at 20°).




The underneath of the MX Ergo vs one of my well-worn M570s: you can see the solid foot and the absence of a battery hatch; the MX Ergo recharges via USB, meaning I no longer have to carry around spare batteries.



The detachable foot, which sticks on the MX Ergo with 2 magnets. Can't say I'm in love with the solution, and I would like the option to leave it off, but there is no denying that it sticks on well and gives both weight and stability.




Given the cost, the less rugged nature and the “waiting to fall off” base, the investment in a case seemed a prudent idea, so I went with a Hard Travel Case by co2CREA.

Conclusion

I love this mouse. I REALLY love this mouse. Yes, it costs more than its predecessor; yes, you can't just bung it in your laptop bag anymore; but to use, and in particular to use on different platforms, it is just a pleasure: comfortable, responsive and smooth…… Happy sigh.

Unlimited Test Email Addresses

A silly tip that has saved me tons of hours and makes clients happy is having an email domain with “catch-all” routing on it.

Basically this means having a domain where any address you use with it automatically routes to a central email address. Mine is the “energywins.co.uk” domain; anything you send to that domain ends up at my main address, be it “clientTest1@energywins.co.uk” or “fakeUser200@energywins.co.uk”. This did not use to be that useful when all apps were internal, but in the world of cloud apps, and PARTICULARLY in the Salesforce/Pardot world that only allows an email address to be registered once, it is invaluable and helps you keep clients separated (they also love you to use email addresses that are specific to them, e.g. “MicrosoftTEST@energywins.co.uk”).

This can be done easily with just about any email provider, but I use Gmail for domains as it is easy, fast and cheap[1].

Strangely, the Gmail for domains instructions keep changing, are oddly poor for Google, and the setting is buried REALLY deep, which I assume means they don't really want you to use it. So if things change, just search for “Gmail Catch All”; in the meantime:

  • From your inbox, Click on the little cog on the right-hand side and select “Manage this Domain”

  • Then select “Apps”

  • Then select “G Suite”

  • Then select “Gmail”

  • Then scroll down to the bottom and select “Advanced Settings”

  • Then scroll nearly down to the bottom and under “Routing” you will see the setting for “Email routing”, change the radio button for “Unknown mailbox account messages” to “Route to catch-all address”, and put in your main email address for this domain.

That's it. For most people this is not a suitable setting, because it slightly increases your spam, but for me, and anyone who needs a constant stream of individual email addresses without losing track of ones used 6 months ago, it's invaluable.

References

1 Some of my colleagues use temp email domains such as http://www.throwawaymail.com, but I have found that such addresses have a habit of being needed again in 6 months when the client comes back for more work.