Why it’s time to move our business away from 2020 Fusion design software

My small business specialises in the design, supply and installation of kitchens, bedrooms and bathrooms. With my penchant for computing, it’s no surprise we rely heavily on information technology. However, I don’t write and build all the software we use, and that’s especially true when it comes to CAD (Computer Aided Design).

On the surface, CAD is really simple. Draw some lines on the screen. Colour in some spaces, print some pictures and maybe export some things for other software. So far, so good. Of course, the devil as always is in the detail. We need our software to draw the pictures based on products and materials we can buy from our suppliers. We need it to generate accurate plans and we need it to generate respectable pictures so our customers can visualise what they’re going to (hopefully) buy. Being the type of business we are, a lot of what we do includes unique and bespoke configurations. ‘Simple’ things like doors shaped to fit a ceiling and exceptional work like the Star Field ceiling.

From shortly after we began our business almost 20 years ago we’ve been using software called ‘Planit’, which changed its name to become ‘Fusion’, which then changed its name again to become ’20-20 Fusion’ (as Planit were bought out by another CAD company called 2020). This is an industry-specific program. Rather than use a generic CAD program which just creates the graphics and images (AutoCAD or SketchUp are good comparisons), this program lets us choose and add cabinets quickly to the design and then also shows the price for all the items we’ve added. Being industry-specific, it’s also set up to do the things we need quickly; for example, adding and calculating cornice and pelmet lengths and quantities for a room in a few clicks of the mouse. All of these things can be done in products like SketchUp, they just take longer.

The ranges we can add in Fusion represent the real world ranges, with accurate and ‘photo realistic’ representations of real products. Indeed, this is the whole reason for using Fusion – the accurate range of products and the prices. Without that, the product is no better than any other generic design program.

We pay over £1,000 a year to 2020 for ‘maintenance and support’ (not the same as the licence to use the software, which was several thousand pounds as a one-off charge a long time ago), but we’ve come to some disagreement recently over what that actually means. To me, it means they provide the software and updates to that software over time (e.g. making sure it works with Windows 10, adding features to keep up with modern technology) and, most importantly, they’ll keep up to date with changes the manufacturers make to their products, be that new products, new pricing, new shapes of end panel, and so on. Most of the software updates over the last few years have been minor from our view; we simply don’t need bells and whistles as much as we need accurate 3D objects and prices. It’s the accurate 3D objects and prices that have become a problem.

The 2020 Fusion view of the arrangement is:

I understand your frustration with catalogue updates. However the maintenance & support monthly fees include software updates pertaining to the Fusion software, manufacturer catalogues updates are maintained as per the contract agreement with each manufacturer.
Pasquale Brancati, 2020 Director Customer Support & Retention

I don’t understand how that sits alongside the support renewal letter (2013-14 was the first I found) which says things like ‘Full access to our extensive library of supported manufacturers catalogues’ and ‘maintenance and updates to certain selected catalogues’.

I’m also struggling to reconcile that with their brochure, which says things like ‘Instantly produce customer quotes, with all manufacturer items accurately priced, based upon the features and options chosen.’ (page 2) and “We are continuously working with manufacturers to forge new and lasting partnerships, ensuring that our customers benefit from the widest range of options available, delivered in parallel with the release of their product manuals.” (page 6). Without the manufacturer catalogues, there is no useful program.

My experience is that 2020 no longer provide accurate and up-to-date supplier catalogues, or even timely corrections to error reports. A few examples:

  • Crown Kitchens’ ‘Rococo’ range, released almost a year ago (I have, to hand, a price list from May 2015 that offers this range), yet still not in the Fusion catalogue.
  • PWS end panel with a radius front profile, no graphic despite availability for a long time.
  • Daval Kitchens & Bedrooms… years out of date, though to be fair, Fusion have *just* begun to provide an almost-working kitchen catalogue, but I’m pretty sure that’s only because Daval have paid them to do so. It still isn’t complete at the time of writing.

Those 3 suppliers are listed in the Fusion marketing brochure (page 6) and yet clearly Fusion are not updating the data. Why?

Well, I started asking questions (it appears I’m not the only one) and eventually I got a reply. Fusion are trying to charge the manufacturers, as well as me, for keeping the data up to date. Perhaps they always did, and perhaps the manufacturers have decided Fusion want to charge too much. This means the updates aren’t done, and the software is becoming useless. OK, useless is harsh: it’s becoming expensive and ineffective for its original use. It is no longer fit for purpose.

Here are extracts from the Fusion response emails:

fusionsupport
24 Jan 2016

Thanks for getting in contact with us.
Yes there has not been a update on the PWS Second nature catalogue for a while, this is due to the manufacturer not providing us with the latest information. I am very certain that there is a update due soon, i will double check with my colleague in our data team and get back to you. I have looked through our outstanding data errors and i can see that you reported a missing item from PWS.

From: fusionsupport
Sent: 09 March 2016 10:37

Hello,
Data can you provide some info on the below concern please, as the customer is quite frustrated.

From: Sacha Smiley 
Sent: March-09-16 9:57 AM

Hi,
Sales are currently in discussion with Crown and PWS about updating the Fusion catalogues.
Gary, can you provide a status on Crown, PWS? Daval we’ve had approval for an update on bedrooms, but were instructed to hold off due to the Insight project.
Thanks

From: fusionsupport

Sent: 11 March 2016 11:12

Subject: RE: Yet more catalogues out of date… is it time we move software?

 
Hello,
I still haven’t received an update on this. We are in the midst of losing another customer. We need answers please.
Thanks

From: Gary Court 

Sent: 11 March 2016 12:03

Subject: RE: Yet more catalogues out of date… is it time we move software?

 
Sacha
I met with Crown at KBB he has asked me to look at outstanding SOW as he feels it is expensive and he is also sending me another update to analyse both sow’s will be completed at the same time
PWS I am in talks to use System Six cat and rename ranges as it is all PWS doors
Gary

So, there we have it. The public line always given to me, over the years, has been ‘The manufacturer isn’t supplying us the data’ but the real story is Fusion won’t do the update until the manufacturer pays. The manufacturers won’t tell us this, because I’m sure they don’t want to upset Fusion. Planit has a lot of businesses like mine using their software so it’s very important for manufacturers to have their products available there. Indeed, I wouldn’t consider replacing Daval, PWS or Crown with another supplier unless their products were available within the design software we use.

I don’t have a problem with manufacturers being charged, but I’m not keen on how it’s been hidden from view, and I’m very displeased that the current situation is catalogues out of date and updates and fixes not being carried out. If only the industry had a more open system and manufacturers could more easily create their own data. We have every supplier send us price lists by email, mostly Excel with a few using CSV. I then import and reformat these to suit our needs and our business system. If only they could send us a file for design software in the same way. Maybe, one day, we’ll all be using software like SketchUp and manufacturers will send us catalogues directly. Could you imagine if you could only send a spreadsheet to your customer if you paid Microsoft to create and update it for you?

What to do now?
We’re going to move to another software solution. I’m absolutely certain there will be some annoyances with that software – maybe the pictures aren’t as nice, maybe it doesn’t allow us to import 3D objects we’ve designed in SketchUp (for our more bespoke solutions) – but the truth is I’d rather have the up-to-date catalogues. So important is the data to my business that we’ll be spending several thousand pounds a year for the alternative software. The data really is the valuable bit.

My biggest concern remains that the system we move to will, at some point in the future, also deteriorate into not maintaining manufacturer catalogues. We use different software for bathroom design (better 3D objects, no pricing, but that isn’t so important in bathroom design) and they too go through phases of being slow to update some catalogues.

Solicitors who spam? That would be Edwards Duthie Solicitors

Ah, the spam wars. I haven’t posted to this topic for a long time. Not because the spam has stopped but because I’ve had other things to occupy me.

I thought I’d blog this bit of spam though: the solicitors’ firm Edwards Duthie, who decided to spam my email with the image below. I emailed them twice to ask where they got my email address from, but they didn’t reply. I then thought I’d look for the email address of their data controller, but it turns out the Information Commissioner only publishes a postal address in the public register of data controllers.

On their web site, they describe themselves as “With 13 Partners and some 100 staff operating out of two main offices and three satellite offices, we are one of the largest law firms in the East London/West Essex region.” so perhaps I’m naive to have expected a reply to my emails, or for them to follow the law. Or maybe they know the law has no teeth, so it’s OK to spam because it won’t affect their business negatively as much as they will gain from people that reply.

The email was sent via mailchimp. That means I can unsubscribe knowing mailchimp will block any more, perhaps even drop Edwards Duthie as a customer if they have lots of complaints. It also means it got through my spam filters (mailchimp do a good job of keeping spammers from their network) and there’s every chance wherever Edwards Duthie bought my email address from will continue selling it, I’ll continue getting spam and a couple of times a week I’ll be pulling email I want from my spam folder amongst the thousand or so others from people like Edwards Duthie.

[Image: the spam email sent by Edwards Duthie]

Royalty free music and a time lapse video for work

Here’s a great way to start 2016: win an award for “Best Domestic Bathroom Installer 2015”.
My brother David entered the Geberit Awards (Geberit being a large multinational manufacturer of bathroom products) and, out of all the entries from across the UK, he won. We’re very proud 🙂

That prompted us to finish editing a timelapse video of the winning bathroom. Rather than leave it a silent movie, I went searching for suitable music to accompany it and found the track “Pamgaea” by Kevin MacLeod. Best of all, the licence to use this soundtrack was ‘royalty free’, as well as being free of cost on condition it was clearly attributed to the author. That’s very much like the software code I’ve written and shared, although Kevin is a master of his craft, whereas I’m just an amateur coding for fun.

As well as free when attributed, the music can also be licensed for a fee when attribution is not possible or wanted (example: background music when you’re on hold). In my mind I always thought licensing that type of music was expensive; it turns out to cost a lot less than I expected.

Migrating from phpBB to Google Groups

For many years I’ve run a tiny web site for the village we live and work in. Eight years ago (or maybe more) I added a forum to the site using phpBB, which describes itself as ‘THE #1 FREE, OPEN SOURCE BULLETIN BOARD SOFTWARE’.

It’s been very good software, regularly updated and very easy to maintain. However, the most interaction I have with the forum now is blocking spam registrations and migrating it to new servers every couple of years. There are only a couple of posts a year now, so I wanted to find a way of reducing my administration workload.

I decided to migrate it to a “Google Groups” group, which is just like a forum with fewer customisation options. I couldn’t find any guides for migrating away from phpBB, so I worked out my own method. Here’s how I did it, in case you’re trying to do the same.

Steps in short form:
1) Get data from phpBB database tables as CSV file
2) Write script to process CSV file into multiple emails to the group

1) Get data from phpBB database tables as CSV file
I only needed to migrate each topic and all its replies. None of the other database content was important to me.
To do this, I wrote a SQL query:

SELECT po.post_subject, po.post_text, po.post_id, po.topic_id, po.post_time, us.username_clean, top.topic_title, top.topic_time
FROM phpbb_users as us, phpbb_posts as po, phpbb_topics as top
WHERE us.user_id = po.poster_id and po.topic_id = top.topic_id
ORDER BY po.topic_id ASC, po.post_time ASC

Essentially, this takes selected columns from the tables ‘phpbb_users’, ‘phpbb_posts’ and ‘phpbb_topics’. I’m not sure using ‘WHERE’ is very efficient and perhaps ‘INNER JOIN’/’OUTER JOIN’ would be technically better, but mine was a small database and this was more than fast enough for me (58ms for 114 rows).

Then I saved the result as a CSV file and opened it in LibreOffice to check. Several of the fields needed some hand editing: removing the first line (headers), replacing some HTML characters, escaping speech marks, etc. I may have been able to fix those when saving the result of the query as CSV, but I didn’t have many to do, so hand-fixing and moving on was fastest.
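If you have more than a handful to fix, the same cleanup can be scripted. Here’s a rough sketch of the idea (the file names are placeholders and the set of fixes is an assumption to adapt to your own export): it decodes HTML entities in every field and drops the header row.

```ruby
require 'csv'
require 'cgi'

# Rough cleanup sketch - file names are placeholders.
# Decode HTML entities (phpBB stores things like &quot; in post text)
# in every field of a row.
def clean_row(fields)
  fields.map { |f| CGI.unescapeHTML(f.to_s) }
end

# Read the raw export, drop the header row, write the cleaned file.
def clean_csv(src, dest)
  rows = CSV.read(src)
  rows.shift # remove the header line
  CSV.open(dest, 'w') { |out| rows.each { |r| out << clean_row(r) } }
end
```

CSV.open takes care of re-quoting fields that contain commas or quote marks, which covers the ‘escaping speech marks’ step.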

2) Write script to process CSV file into multiple emails to the group

My scripting language of choice is Ruby. Not because it’s any better than anything else, it’s just what I happen to be using lately. I could have done the same in PHP if I’d spent a little more time on it.

This is the script:


# I saved file as: process.rb
# to run, "ruby process.rb" ... assuming you have ruby installed ;-)
# I had to install Pony from github, which I did using the specific install gem
# gem install specific_install
# gem specific_install -l https://github.com/benprew/pony
#
# If you're reading this later and forget where it came from,
# https://www.steveroot.co.uk/2015/11/migrating-from-phpbb-to-google-groups/
# Share any tips and fixes in the comments there to help others please!

require 'csv'
require 'date'
require 'pony'

# Send one topic (with all its replies) as a single email to the group
def send_to_group(subject, body)
  Pony.mail({
    :to => 'YOUR-GROUP-ADDRESS@googlegroups.com',
    :from => 'YOUR-EMAIL-ADDRESS',
    :subject => subject,
    :via => :smtp,
    :body => body,
    :via_options => {
      :address => 'smtp.gmail.com',
      :port => '587',
      :enable_starttls_auto => true,
      :user_name => 'YOUR-EMAIL-ADDRESS',
      :password => 'YOUR-PASSWORD',
      :authentication => :plain, # :plain, :login, :cram_md5, no auth by default
      :domain => "YOUR-SENDING-DOMAIN" # the HELO domain provided by the client to the server
    }
  })
end

# Initialise the topic counters and some default text for the first email.
# You will need to delete this first 'initialise' email manually in the google group!
currenttopic = 0
lasttopic = 0
body = "initialise"
subject = "initialise"

CSV.foreach('phpbb_data.csv') do |row|

  # get current topic
  currenttopic = row[3]

  if currenttopic == lasttopic
    # This is a reply to the topic, add to the existing body
    body = body + "\n"
    body = body + "-----------------------------------------------------" + "\n"
    body = body + "reply_by_username: " + row[5] + "\n"
    # Use post_time (row[4]) for the reply date, not topic_time
    body = body + "reply_date: " + DateTime.strptime(row[4], '%s').strftime("%d/%^b/%Y") + "\n"
    body = body + "\n"
    body = body + row[1] + "\n"
  else
    # This is a new topic. SEND the last group of messages
    send_to_group(subject, body)

    # A message to terminal on every send, nice to know that something is happening!
    puts "Sent " + subject

    # Reset the body and set the subject as the topic name
    body = ""
    subject = row[6]

    # Put some generic header text in place
    body = body + "-----------------------------------------------------" + "\n"
    body = body + "This post was transfered to the google group when the phpbb based forum was shutdown" + "\n"
    body = body + "You might find relevant information at YOUR-DOMAIN" + "\n"
    body = body + "This entry includes all replies to the original topic" + "\n"
    body = body + "-----------------------------------------------------" + "\n"
    body = body + "\n"
    body = body + "Topic: " + row[6] + "\n"
    body = body + "created_by_username: " + row[5] + "\n"
    body = body + "topic_date: " + DateTime.strptime(row[7], '%s').strftime("%d/%^b/%Y") + "\n"
    body = body + "\n"
    body = body + row[1] + "\n"
  end

  # set the value of last topic ready for the next loop.
  lasttopic = currenttopic

end

# Send the final topic (the loop above only sends when a new topic begins)
send_to_group(subject, body)

# These are the fields in order in the CSV. Here for easy reference whilst I coded
# numbers start from zero (so post_subject = row[0])
# "post_subject", "post_text", "post_id", "topic_id", "post_time", "username_clean", "topic_title", "topic_time"

Being very lazy, I didn’t write the code to understand the first pass should *NOT* be emailed to the group, so the first email to the group titled ‘initialise’ will need to be deleted manually.

You will need to enter your own values for: the group address, your email address and your sending domain. You’ll need a password too, but be aware that if you use two-factor authentication you’ll need to generate an app-specific password for your account.

You will want to customise the text that is added to every email, perhaps correct the spelling of ‘transfered’ too 😉

The script isn’t particularly fast, as it connects and sends each email individually. We use Google Apps, and as there weren’t many topics to send it was well within my daily Gmail sending limit; had the count been higher, I could have sent them directly via SMTP. There are instructions for the different email methods on the ‘Pony’ github pages. The other problem I had was errors in the CSV causing the script to stop; for example, some replies had no topic name, and the script errored when it encountered them. In my case I fixed the CSV, deleted the posts already made to the group, and ran the whole script again. You might prefer to set up a dummy group to send your messages to first, make sure everything works, then delete the dummy group and re-run the script against the real group.

To test the email messages, I suggest you take a few rows of your CSV file and send them to your own email to check formatting and content.
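One way to cut that test file without hand-editing (a sketch; the file names are placeholders) is to copy the first few rows of the full export into a smaller CSV and point the script at that:

```ruby
require 'csv'

# Copy the first few rows of the full export into a smaller CSV,
# so a trial run only sends a handful of emails.
def sample_csv(src, dest, rows = 5)
  sample = CSV.read(src).first(rows)
  CSV.open(dest, 'w') { |out| sample.each { |r| out << r } }
end
```

Change the :to address in the script to your own email, run it against the sample file, then switch both back.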

If you’re wondering what my results looked like, here’s one of the topics with a reply once posted to the google group

Birthday Calculator – in case you don't want to wait a whole year to celebrate being alive

We have a tradition where I live. We celebrate being alive with a party, and that party generally coincides with being alive for another 31,557,600 seconds. 31,557,600 seconds happens to be just about equal to a solar year, which is a happy coincidence as it’s not so easy to remember otherwise.

I decided I could really do with a good excuse to party before that arbitrary unit of time though.  The solution? Write a web application where I can put in my date of birth and it will tell me other dates that I can celebrate on.

Try it for yourself at http://birthday.sroot.eu and it will tell you amazing things like:

  • How old you would be if you were born on Mercury, Venus, Mars and the other planets in our solar system
  • When your next MegaSecond birthday is (so you can have a party when you survive another 1 million seconds of existence)
  • Or, for a really big bash, celebrate a GigaSecond birthday (these come around very infrequently in a lifetime).
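The calculation behind those milestones is simple. This isn’t the site’s actual code, just a sketch of the idea: count how many whole units (a megasecond, a gigasecond…) you’ve been alive, then add one more unit to your date of birth.

```ruby
# Sketch of the milestone calculation (illustration only, not the site's code).
# Returns the number of the next milestone and the time it falls on.
def next_milestone(born, unit = 1_000_000, now = Time.now)
  n = ((now - born) / unit).floor + 1
  [n, born + n * unit]
end
```

For example, someone born at midnight UTC on 1 January 2000 reaches their second megasecond birthday 2,000,000 seconds later, on 24 January at 03:33:20 UTC.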

If you’d like me to add another arbitrary repeating unit of time post a comment.

Virtual PDF Printer for our small office network – a step by step how to

Alternative title: How I got multiple cups-pdf printers on the same server. (I didn’t, but postprocessing let me work around the problem).

Preamble:

I have a small business. For years we’ve been creating PDFs from any computer on our network through a ‘virtual appliance’ called YAFPC (“Yet Another Free PDF Composer”).

The appliance originally ran on an old PC, then on a server that ran several other virtual machines. It had a neat web interface and would allow PDF printers to be created that would appear on the network for all of our users to use. It had one printer for plain A4 paper, one for A4 paper with a letterhead background, another for an obscure use of mine, and so on. If you printed to it, it would email you the PDF (for any user, without any extra setup needed per user). It could also put the PDFs on one of our file servers or make them available from its built-in file server.

If I remember correctly it cost £30 and ran from 2006 right through until today, November 2014. One of my best software investments!

However, Windows 8 came along and it no longer worked. Getting Windows 8 to print to it directly turned out to be impossible. The program was not going to be updated or replaced with a new version. I managed a short-term workaround, having Windows 8 print to a Samba printer queue which converted and forwarded to the YAFPC virtual appliance. There were problems, page sizes not being exact and so on, but it worked after a fashion.

Roll forward to today, when I’ve just got a new network PDF virtual printer working. It wasn’t so easy to do (some 20 hours, I guess), so here are my setup notes for others to follow. The final run-through of these notes had it installed and working in about an hour.

These steps assume you know quite a bit about setting up Linux servers. Please feel free to use the comments to point out errors or corrections, or add more complete instructions, and I’ll edit this post with the updates. Also, please suggest alternative methods that you needed to use to meet your needs.

Overview – We are going to create:

  • a new Ubuntu based linux server as a virtual machine
  • Install CUPS, the Common Unix Printing System
  • Install CUPS-PDF, an extension that allows files to be created from the print queue
  • Create a postprocessing script that will run every time CUPS-PDF is used, customising our PDFs and sending them where we want them (to our users).

Sounds simple, right 🙂
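As a rough sketch of the middle steps (the package names are Ubuntu’s; the script path here is my own placeholder, the real postprocessing script comes later):

```shell
# Install CUPS and the CUPS-PDF backend
# (on newer Ubuntu releases the package is 'printer-driver-cups-pdf')
sudo apt-get install cups cups-pdf

# Then edit /etc/cups/cups-pdf.conf and point CUPS-PDF at a
# postprocessing script to run after every job; the comments in
# that file document the arguments the script is called with:
#
#   PostProcessing /usr/local/bin/pdf-postprocess.sh
```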


sunspot solr slow in production (fixed by using IP address instead of domain name)

Short version:
————–
In my sunspot.yml I used an FQDN (solr.rkbb.co.uk) and Solr was slow.
When I used the server IP (10.18.6.224), Solr was fast.

Setting the scene (you can skip this bit):
——————————————-
I’ve been slowly working on some improvements to our business system at work. Whilst most of it currently runs on MS Access and MySQL, I’m slowly moving bits into Ruby on Rails. One of the most important things our current system does is store prices and descriptions for over 200,000 products. Searching that database is a crucial task.

Searching in Rails turned out to be very easy. Sunspot had it working very quickly on my development machine. I also had it running on my production server using the sunspot_solr gem, which is meant for development only (but mine’s a small business, so that’s fine). However, when the server was restarted, sunspot_solr needed to be manually restarted, which was a pain. I thought I should probably get around to setting up a real Solr server and pointing my application at it. So far, so good. Simply: copy the config from my Rails app to my new Solr service, set the server’s hostname in sunspot.yml, commit, deploy, it worked!

The problem – Solr was terribly slow!
——————————————-
Re-indexing was slow. I could tell something wasn’t right: neither my Rails server nor my new Solr server was under load.
I created a new product instead (so it would appear in the Solr index).
That was slow, but it worked. Displaying search results was also slow.

Check the logs – wow! Yep, Solr is the slow bit


Started GET "/short_codes?utf8=%E2%9C%93&search=test" for 10.18.6.3 at 2014-10-01 14:28:03 +0100
Processing by ShortCodesController#index as HTML
Parameters: {"utf8"=>"✓", "search"=>"test"}
Rendered short_codes/_navigation.html.erb (1.0ms)
Rendered short_codes/index.html.erb within layouts/application (6.7ms)
Rendered layouts/_navigation.html.erb (1.3ms)
Completed 200 OK in 20337ms (Views: 10.3ms | ActiveRecord: 1.7ms | Solr: 20321.1ms)

No way should Solr take 20321ms to respond.

I tried the search on the solr admin interface and the response was instant, so I knew that solr wasn’t the problem. It must be my code (as always!).

As Solr replies over HTTP, I tried querying it from my Rails server’s command line. Also slow. So… maybe it’s not my code… Then I tried pinging the Solr server from my Rails server:

ping solr.rkbb.co.uk

It said replies were coming back in less than 1ms… but then I realised the reports were appearing about 3 or 4 seconds apart.
I tried pinging another server… same effect…
Then I tried pinging my office router: reports every second, just as fast as I’m used to seeing. But this was the first time I’d used an IP address and not an FQDN.
Then I tried pinging my Solr server by its IP address… reports every second!

So, maybe all I have to do is configure my application to talk to solr via the server IP instead of FQDN…
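The change itself is one line in config/sunspot.yml (the port and path shown follow sunspot_rails’ usual layout and are illustrative; only the hostname matters here):

```yaml
production:
  solr:
    hostname: 10.18.6.224   # was solr.rkbb.co.uk; the IP skips the slow lookup
    port: 8983
    path: /solr/production
```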

I tried…


Started GET "/short_codes?utf8=%E2%9C%93&search=test" for 10.18.6.3 at 2014-10-02 11:51:49 +0100
Processing by ShortCodesController#index as HTML
Parameters: {"utf8"=>"✓", "search"=>"test"}
Rendered short_codes/_navigation.html.erb (0.9ms)
Rendered short_codes/index.html.erb within layouts/application (8.4ms)
Rendered layouts/_navigation.html.erb (0.8ms)
Completed 200 OK in 27ms (Views: 12.2ms | ActiveRecord: 1.1ms | Solr: 8.3ms)

… and I fixed it 🙂

Well, Solr is working great. Now I need to figure out what’s wrong with using FQDNs on my network.

BT Wifi hotspots intercept and redirect google SSL searches

BT WiFi hotspots intercept and redirect SSL searches. I wouldn’t have noticed them doing this except Google told me. That does beg the question: if I was buying something on a website and didn’t notice the redirect from HTTPS to HTTP, could other people on the hotspot be snooping on my transactions?

I guess BT have a good reason for doing it but this reminds me I must always connect to the internet via a VPN when on a public hotspot.

How to change a folder icon to a picture in Mac OS X

In an earlier post I showed how I set my screenshots to save to a custom folder rather than onto my desktop (I seem to take a lot of screenshots). I also shared a little camera icon that I made for it. One of the comments asked how I changed the folder icon, so I’ve made a 30-second screen recording to show how.

1. Go to a web page that has the image you want
2. Right-click the image (or Ctrl + left-click)
3. Choose ‘Copy Image’
4. Select the folder you want to change the icon for (single left-click the folder)
5. Press Cmd + I together (opens the Info pane)
6. Left-click the folder icon shown in the top left of the Info pane (it will get a blue highlight border)
7. Press Cmd + V together (the shortcut for paste). You’ll see the image has replaced the folder icon.

VMware consolidated backup missing a catalogue file – fixed!

As always seems to be the case, a routine update of server software becomes a problem. This time it was updating VMware ESXi from 4 to 5. I know, I’m a little behind the times, but it was working, and it’s only a small office server… and I should have left it alone, sigh.

So: shut down the virtual machines and, overnight, copy them all over the network to my laptop and a handy external disk. Done.
Note: I probably should have used the VMware standalone converter to copy them, rather than copying them straight from the datastore.
This morning, in at 8am, install the new ESXi (having lost two hours because the DVD drive on the server was playing up).

Start restoring the Virtual Machines. First a non important one… all good

Second, the most important one, our file server…. uh oh.

"The VMware Consolidated Backup source ... has a missing catalog file."

Several hours of trying to fix it followed: editing files, trying different versions of the VMware standalone importer (which may have helped, I’m not sure). In the end I solved it by opening the virtual machine in VMware Player, which spotted the problem (I had the VM disks split across two datastores but I’d saved them into one folder) and asked me to tell it where they were. That fixed it for VMware Player, which also meant the importer was happy again.

PS – I also realised why I never upgraded from VMware ESXi 4: version 5 takes away a lot of the essential functionality from the vSphere software, which makes it not a lot of use for me. Still, it was free. So, having fixed the import, I’m now waiting to import it back to a fresh install of version 4. At least I finally set up the 3+1 RAID 5 (instead of the two sets of RAID 1 left over from the original disks and an upgrade two years ago).