Banks and retailers across the country are scrambling to prepare for the switch to chip-enabled credit cards later this year, but do you know what the changes mean? Chip cards use a different technology than the traditional magnetic stripe cards we’ve used for years and are more secure, but not everyone is convinced the upgrade goes far enough in protecting cardholder data.
By October of this year, banks and retailers will be required to issue and accept the new chip cards, a change that calls for new banking and point-of-sale (POS) equipment. These upgrades cost an estimated $500 to $3,000 per payment terminal, for a total estimated cost of $8.65 billion.
The new chip credit cards will continue to require a signature, working just like your current magnetic stripe credit cards. In Europe, where the chip card system has been in place for several years, credit cards require a PIN instead of a signature, which is arguably more secure; US retailers and bankers pushed for signatures because they are more familiar to US customers.
A recent increase in database hacks and stolen credit card data has boosted support for the switch, which will make it harder for criminals to copy your card’s information and create a fraudulent duplicate. Counterfeit cards account for 37% of credit card fraud in the US.
What this change won’t protect against, however, are database hacks like those at Target and Home Depot. Preventing those would require upgrading to an entirely new system at an additional cost of $1,000 to $4,000 per payment terminal, a prohibitive expense for many retailers.