Behind every ❗️❗️❗️🚨🚨🚨URGENT🚨🚨🚨❗️❗️❗️ there is a person who's about to miss a deadline and, instead of working on themselves to prevent that from happening in the future, makes it the developer's deadline to miss
I’m in my first professional role, and we’d just completed the first project; aside from my boss, I was the only other dev. So I was naturally excited for the client's feedback on it.
Well, fast forward a couple of months where they really didn’t interact with the application much, and then came the queries, and then the not understanding how to use it. Fine, boss sets aside 10 days for me to write some documentation with screenshots of all the journeys (free of charge).
Again, tumbleweeds. Then all of a sudden it’s boom, emails aplenty.
"Can you fix this, this is a major bug" kind of emails. Except it isn't a bug, you just don't know how to use it.
Now we are dumbing down the software to align more with what the business is used to, which is fine, but even my boss has said (as I overthink and want to reply to things instantly) that just because they've suddenly come to life doesn't mean we drop everything else to tend to them.
Welcome to the professional world, where everything is iterative and 95% of your clients (internal or external) are data illiterate and don't want to learn whatever self-service tools you build.
That's just normal software development with contracts and waterfall. Usually with agile it's mitigated to some extent, because with agile the customer is on board and can't say afterwards that they didn't want it.
And the data they want is the entire FY, 3,000,000 records, and they need every single data attribute, making the file like 250 MB. Then you put it in their SharePoint and they get mad they can't just view it in the browser, despite the giant "This file is too large to view online, download it" message.
Newspaper: Hackers are announcing a trove of personal data leaked from [company] after a forwarded spreadsheet inadvertently contained more data than the sender realised.
Same feel as "how long is this going to take to pull?" Well, I don't know if part of what you're asking for exists, how clean it is, or whether I can join the data you're talking about, so anywhere from 5 minutes to never?
That's exactly how you should respond. I've been on the requester side for some of these, and if my team gave me that as a response I'd just say "let me know what you find out or when you know more."
How many widgets have we transferred to acme this year?
Simple enough question right?
But then when you look at the data, each region works with acme's local offices differently. Some transfer using one method, some offices mark the transfer in the system as "other firm". Oh, and we don't even get a data feed from the north west region because they still haven't upgraded their shit so I can request a spreadsheet but it's in a different format than everything else.
Then inevitably Acme has a different number of widgets that have been transferred, because if a transfer gets kicked back or cancelled, it's easier to just create a new transfer than to go fix an old one. That process is laborious and requires tons of approvals, so they just create a new transfer and send it over.
But yea, 20 minutes should be enough time to get you that before your meeting with Acme.
Man I don't regret leaving this behind at my last job. You start out by doing someone a one-off like "sure I can pull the top 5 promotional GICs broken down by region for your blog article - I love supporting my co-workers!"
Then the requests become increasingly esoteric, arcane, and insistent.
You try to build a simple FE to expose the data for them, but you can't get the time approved so you either have to do it with OT or good ol' time theft, and even then there's no replacement for just writing SQL, so you'll always be their silver bullet.
At that point you teach them how to do it themselves. Isn't there a way to give them an account that only has read access so they can't inadvertently screw up the database?
I like that idea, and it actually did work for our Marketing guy (Salesforce has a kind of SQL). Near the end there, I just had to debug a few of his harder errors, or double check a script that was going to be running on production.
Never thought of it for Postgres or MySQL, etc., but I suppose there's got to be an easy enough way to get someone access.
In Oracle you'd just set up a user that has limited access and give them those credentials. Creating a few views that pull in the data they want is a bonus.
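For Postgres, a rough sketch would be something like this (role name and schema are made up, untested off the top of my head):

    -- login role that can only read
    CREATE ROLE reporting_ro LOGIN PASSWORD 'change_me';

    -- let it see the schema and read the existing tables
    GRANT USAGE ON SCHEMA public TO reporting_ro;
    GRANT SELECT ON ALL TABLES IN SCHEMA public TO reporting_ro;

    -- cover tables created later, too
    ALTER DEFAULT PRIVILEGES IN SCHEMA public
        GRANT SELECT ON TABLES TO reporting_ro;

No INSERT/UPDATE/DELETE grants, so the worst they can do is write a slow query.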
At work, I am currently dealing with a table that has no primary key, no foreign key, duplicate (almost) serial numbers, booleans stored as strings, and so on. It's a nightmare of a table.
Entity framework is acting like I'm on meth for using such a table.
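For anyone who hasn't had the pleasure, picture something roughly like this (a made-up approximation, not the actual table):

    -- no primary key, no foreign keys, nothing enforcing uniqueness
    CREATE TABLE DeviceRecords (
        SerialNumber VARCHAR(50),   -- "unique", except for the duplicates
        IsActive     VARCHAR(10),   -- 'true', 'True', 'YES', 'y'...
        CreatedDate  VARCHAR(20)    -- good luck sorting these
    );

Entity Framework generally wants a key on every entity, so mapping a table like this means either configuring it as keyless or faking one.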
I have been trying to get people in my area to give their new table a generic name, since it's going to be the only table that can map a date range to a different date range, but I'm on holidays now, and they can't imagine anything other than their little project needing this table, so it's going to be named for this one project, and its columns will be named for the specific data they'll hold :(
Yeah. Luckily the work I am doing is to fix some really bad work that the entire company has been complaining about. So once it's fixed it will hopefully be a little bit more recognition than that. Plus my boss is pretty level headed.
But who fucking knows? There is always the likelihood that people will say things along those lines. And it ain't my job to fight them on that.
No, we have worse. Dates sometimes stored as strings, sometimes as datetimes, and sometimes as integers. There is no consistency, logic, or forethought to the schema.
Me this morning: I'm gonna take a look at why this Jenkins pipeline is failing. This one job starts a dozen others. Half are failing. For different reasons. Start rewriting a job that someone half-assed. Realize the original error was caused by missing input, but some of the jobs are still valid. Still can't figure out why my rewritten program is erroring. Get pulled away because another program did something weird... I completed nothing today but worked a ton.
Basically scripts you can run on the fly to pull calculated data. You can (mostly) treat them like tables themselves if you create them on the server.
So if you have repeat requests, you can save the view with maybe some broader parameters and then just SELECT * FROM [View_Schema].[My_View] WHERE [Year] = 2023 or whatever.
It can really slow things down if your views start calling other views, since they're not actually tables. If you've got a view that you find yourself calling from a lot of other views, you can extract as much of it as you can that isn't updated live into a calculated table that's refreshed by a stored procedure. Then schedule the stored procedure to run at a frequency that best captures the changes (usually daily). It can make a huge difference in runtime at the cost of storage space.
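Roughly, in SQL Server terms (the schema, snapshot table, and column names here are invented for the example):

    -- the snapshot table that stands in for the slow view
    CREATE TABLE [Reporting].[My_View_Snapshot] (
        [Year]   INT,
        [Region] NVARCHAR(50),
        [Total]  DECIMAL(18, 2)
    );
    GO

    -- refresh it from the real view
    CREATE PROCEDURE [Reporting].[Refresh_My_View_Snapshot]
    AS
    BEGIN
        TRUNCATE TABLE [Reporting].[My_View_Snapshot];

        INSERT INTO [Reporting].[My_View_Snapshot] ([Year], [Region], [Total])
        SELECT [Year], [Region], [Total]
        FROM [View_Schema].[My_View];
    END;
    GO

Then point a SQL Server Agent job (or whatever scheduler you have) at the proc so it runs nightly, and have everything downstream read from the snapshot table instead of the view.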
A view is a saved query that pretends it's a table. It doesn't actually store any data. So if you need to query 10 different tables, joining them together and filtering the results specific ways, a view would just be that saved query, so instead of "SELECT * FROM (a big mess of tables)" you can do "SELECT * FROM HandyView"
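So something like this (the tables and joins are made up, but it's the idea):

    CREATE VIEW HandyView AS
    SELECT o.order_id,
           o.order_date,
           c.customer_name,
           r.region_name
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    JOIN regions   r ON r.region_id = c.region_id;

    -- and now nobody has to remember the joins
    SELECT * FROM HandyView WHERE order_date >= '2023-01-01';

The query still runs against the underlying tables every time; the view just saves everyone from rewriting it.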
You wanna know why this dashboard takes a full minute to load? It's because it joins every table in the fucking database because some people can't be bothered to look at a separate page for certain information.
Because Jen in accounting doesn't believe in it, and Tom the CIO likes his data stored raw in TXT Amphibious Delineated. Then our biggest client prefers data as Jason so we swapped half of our database to that to speed things up.
But the real problem is high turnover because we don't pay anyone enough to work on things they are proud of. After 2 years we stop doing even 3% COL raises so they go elsewhere. So every 2-4 years each position gets a new opinionated asshole.
our biggest client prefers data as Jason so we swapped half of our database to that
The app I work with currently stores JSON as the only column in a SQL table, and it hurts me so very much. Like watching someone pick up a screwdriver and try to bash a nail in with the handle.
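For anyone lucky enough not to have seen it, it looks roughly like this (made-up names, SQL Server flavor):

    -- the entire data model
    CREATE TABLE AppData (
        Payload NVARCHAR(MAX)   -- a JSON blob; the database knows nothing about what's inside
    );

    -- so every "simple" question turns into string surgery
    SELECT JSON_VALUE(Payload, '$.customer.name') AS CustomerName
    FROM AppData
    WHERE JSON_VALUE(Payload, '$.status') = 'active';

No keys, no types, and as written nothing to index on, so the engine scans and parses every row for every question you ask.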
Nah, this is one of those slick work deals. Legit, all you need to do is list all your entries, drop in a pivot table, arrange your variables to display however you want them, add a little formatting pizzazz, and voila, here's that "report" you asked for!
Me right now, discovering that our archive data has file sizes... sometimes in bytes, sometimes in kilobytes. I found a pattern this morning and thought I could tell which was which, then the pattern collapsed at some date in the past.
Joy.