There are numerous proprietary scoring algorithms out there. The newer ones seem to have fixed this bug by factoring in the history of closed accounts, but many online “free credit report” services still use the old ones.
Old algorithms would often penalize account closure due to sudden reductions in average credit age, available credit, or credit mix (any of which might apply to the OP, but especially if that car loan represented a significant portion of their credit history).
Likewise, they would sometimes reward new debt if it significantly increased available credit or added a unique credit type to the mix. For example, a first mortgage could bump a credit score by 30 points or more, even though the individual is no more creditworthy than they were before.
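To make the arithmetic concrete, here is a toy sketch of two of those factors, average account age and utilization. The accounts, numbers, and behaviour are invented purely for illustration and don't come from any real scoring model:

    # Toy illustration of two scoring factors: average open-account age and
    # credit utilization. All numbers are invented for the example.
    from dataclasses import dataclass

    @dataclass
    class Account:
        age_years: float
        limit: float          # credit limit (0 for installment loans)
        balance: float
        open: bool = True

    def avg_age(accounts):
        open_accts = [a for a in accounts if a.open]
        return sum(a.age_years for a in open_accts) / len(open_accts)

    def utilization(accounts):
        limits = sum(a.limit for a in accounts if a.open)
        balances = sum(a.balance for a in accounts if a.open and a.limit)
        return balances / limits if limits else 0.0

    history = [
        Account(age_years=8.0, limit=0, balance=0),        # long-held car loan
        Account(age_years=2.0, limit=5000, balance=2000),  # newer credit card
    ]
    print(avg_age(history), utilization(history))   # 5.0 years, 0.4

    # An "old" model that only looks at open accounts sees the paid-off loan
    # vanish, and the average age collapses even though nothing got riskier.
    history[0].open = False
    print(avg_age(history), utilization(history))   # 2.0 years, 0.4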
Regardless, I think a good thing to keep in mind is that banks tend to maintain their own internal scoring systems. So not only is there no such thing as “THE score,” but the scores people are referring to when they say that are mostly just one credit bureau’s estimate, based on their proprietary rubric, of how a lender MIGHT see a potential borrower’s likelihood of default.
The banks often use the scores from such scoring companies, as those companies have access to all sorts of contracts, bank accounts, etc. that you have, whereas your bank only has the information you provide directly.
The scoring companies can have a tremendous impact on your life, and they often use completely bullshit factors, like your postal code, where you are punished for living in a "poor" neighbourhood or rewarded for living in a "good" neighbourhood.
There are also credible reports of ethnic discrimination, e.g. if your name is not a "white" name.
The scoring companies should be obliged to provide full disclosure of how they define a score, and banks should be required to disclose when they deny credit based on such a score, with full liability for the scoring company if its score used discriminatory criteria.
The banks often use the scores from such scoring companies, as those companies have access to all sorts of contracts, bank accounts, etc. that you have, whereas your bank only has the information you provide directly.
At least in the US, banks see the full credit report you see, not just a number. Using any of the specific scoring models (FICO X, VantageScore X.0, etc) that are championed by scoring companies or the various US credit bureaus is entirely optional.
The scoring companies can have a tremendous impact on your life, and they often use completely bullshit factors, like your postal code, where you are punished for living in a "poor" neighbourhood or rewarded for living in a "good" neighbourhood.
This is quite a claim. How easy would it be to detect and verify that credit bureaus are using borrowers’ associated addresses substantively in their nationally deployed scoring models? I’d wager a college student with an Excel spreadsheet and a one-line mailer could do this in a single semester. Now consider the CFPB auditor, with direct records access. How long would that take?
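Concretely, the student version of that check could be as simple as asking whether postal code explains score variance beyond the disclosed factors. A rough sketch, where the survey file and column names are entirely hypothetical:

    # Rough sketch: does postal-code group add explanatory power beyond the
    # disclosed credit attributes? The data file and columns are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey_responses.csv")  # score, utilization, avg_age, inquiries, postal_group

    base = smf.ols("score ~ utilization + avg_age + inquiries", data=df).fit()
    with_zip = smf.ols("score ~ utilization + avg_age + inquiries + C(postal_group)",
                       data=df).fit()

    # A meaningful jump in adjusted R^2 (or a significant postal_group
    # coefficient) would suggest the model uses more than the stated factors.
    print(base.rsquared_adj, with_zip.rsquared_adj)
    print(with_zip.params.filter(like="postal_group"))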
There are also credible reports of ethnic discrimination, e.g. if your name is not a "white" name.
Again, I respectfully suggest thinking these conspiracies through. Credit reporting agencies are fancy bookies in the end, right? They live and die by the legitimacy of the service they offer. So if one of their scoring models has worse predictive accuracy because it’s evil, few banks will use it. Not even because it’s evil, just because it sucks.
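And that market pressure is easy to picture: a lender back-testing two candidate models against its own default history just compares predictive accuracy, something like the sketch below (the file and column names are made up for illustration):

    # Sketch of a lender back-testing two scoring models on historical
    # outcomes. File and column names are invented for illustration.
    import pandas as pd
    from sklearn.metrics import roc_auc_score

    loans = pd.read_csv("historical_loans.csv")  # defaulted (0/1), score_model_a, score_model_b

    # Higher credit score should mean lower default risk, so negate the
    # scores before computing AUC against the default flag.
    auc_a = roc_auc_score(loans["defaulted"], -loans["score_model_a"])
    auc_b = roc_auc_score(loans["defaulted"], -loans["score_model_b"])
    print(f"model A AUC: {auc_a:.3f}, model B AUC: {auc_b:.3f}")
    # Whichever model separates future defaulters better wins the business,
    # regardless of anyone's intentions.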
The scoring companies should be obliged to provide full disclosure of how they define a score…
I don’t enjoy defending credit bureaus of all things, but conspiracy theories like the ones you’ve described distract from real systemic injustice and disrupt real collective action.
Edit: changed localized phrasing and content so as to not accidentally come across as disrespectful or dismissive.
Well, in Germany there is an ongoing dispute because the main credit scoring company refuses to disclose the details of its scoring algorithm.
I would also disagree about the banks' interests there. The banks don't see the customers they lose because of an overly restrictive scoring model. Also, for them to see things like discrimination based on names, they would first need to develop a sense for questioning their own prejudices. That is something white people in Germany, and given the constant issues of racial discrimination I'd dare claim in the US too, struggle with extensively.
I do want to ask about this, though:
At least in the US, banks see the full credit report you see, not just a number.
Do you mean the report provided by the scoring company? Or is there a national register where all your contracts, credit cards, bank accounts, etc. are stored and that all banks can access? That is what I mean. Without the scoring company, the bank only knows about the business you have with them and what you disclose to them. They don't know whether you have an overdrawn credit card with a different bank. At least that is how it is in Europe.
I hadn’t heard about the dispute in Germany but found some articles about it. If I’m reading correctly, I would say practices were definitely not more responsible in the US, but the history of disputes here may go back a bit further, with a slew of regulatory reforms we benefit from today — namely FCRA, TILA, BSA (1968-1970), ECOA, FCBA (1974), RFPA (1978), and FACTA (2003).
For them to see things like discrimination based on names, they would first need to develop a sense for questioning their own prejudices.
I definitely agree regarding the universality of blindness to such biases. I suppose automated credit decisions (based purely on scoring models) might have a better shot at eliminating the implicit biases of human agents. Even so, there is a lot of debate over here about how best to filter out data that reflects biases, and about what data is currently being ignored because of them, since any algorithm that solves these issues, in part or in whole, allows better value capture and increased revenue.
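For what it's worth, the checks that debate usually starts from are fairly simple: an outcome gap per group, and whether some remaining feature acts as a proxy for the one you deleted. A rough sketch with hypothetical file and column names:

    # Two common bias checks on a credit-decision dataset. The file and
    # column names ("approved", "group", "postal_code") are hypothetical.
    import pandas as pd

    df = pd.read_csv("decisions.csv")

    # 1. Outcome disparity: approval rate per group (demographic parity gap).
    rates = df.groupby("group")["approved"].mean()
    print(rates, "gap:", rates.max() - rates.min())

    # 2. Proxy check: even with "group" dropped from the model, a feature
    # like postal code may encode it almost perfectly.
    purity = pd.crosstab(df["postal_code"], df["group"], normalize="index")
    print(purity.max(axis=1).mean())  # how predictable group is from postal code alone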
The banks don't see the customers they lose because of an overly restrictive scoring model.
Perhaps, but even if their stakeholders don’t notice or care about discrimination at all, they do take notice when a competitor scoops up a portion of the market they failed to realize due to biased/inferior analysis. After all, the original goal of credit scoring was to increase objectivity and predictive accuracy by reducing bias for the sake of better (more profitable) lending decisions.
To your question re: the credit reporting system in the US, it sounds like it works a bit differently here. There are credit bureaus (sometimes called “reporting agencies”) such as Experian, TransUnion, and Equifax (“the big three”). They functioned historically as lending history aggregators, but now also have scoring models they develop and sell. There are also companies who specialize in scoring models, but they use data from the credit bureaus. These companies often tailor their models to specific markets.
In general, if you apply for any form of credit, the lender formally submits a requisition (a “hard inquiry”) for your file (a “credit report”). Only very specific information is allowed in that credit report, and it must be made available to you at your request (free annually, otherwise for a nominal fee). There are other situations where credit scores can be ordered as a “soft inquiry” without the report, and there are particular rules and restrictions that apply, but the lending history contained in your credit report is what banks and lenders routinely use for applications, even automated/instant credit decisions.
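If it helps to see the moving parts together, here is a simplified sketch of that flow as I understand it. The types and fields are illustrative only, not any bureau's actual schema:

    # Simplified sketch of the US credit-reporting flow described above.
    # Types and fields are illustrative, not any bureau's real schema.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TradeLine:                # one reported account: lender, age, balance, payment history
        lender: str
        opened_year: int
        balance: float
        late_payments: int = 0

    @dataclass
    class CreditReport:             # the file a bureau compiles and must show you on request
        trade_lines: List[TradeLine] = field(default_factory=list)
        hard_inquiries: List[str] = field(default_factory=list)

    def hard_inquiry(report: CreditReport, lender: str) -> List[TradeLine]:
        """Formal requisition for a credit application: recorded on the report,
        and the lender receives the full lending history, not just a number."""
        report.hard_inquiries.append(lender)
        return report.trade_lines

    def soft_inquiry(report: CreditReport) -> float:
        """Score-only pull (prequalification, account review, etc.): not treated
        as an application. The formula here is just a stand-in for whatever
        proprietary model the requester licenses."""
        return 850 - 30 * sum(t.late_payments for t in report.trade_lines)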