Unless you've used your credit card at an ATM to take out a cash advance, you've probably never entered a personal identification number (PIN) when using one. Since you always have to enter a PIN with your ATM card, you might expect your credit card to have one too. In some parts of the world, it does. But the U.S. credit card industry grew up without PINs and has been slow to change.
First Credit Cards
In the first half of the 20th century, stores began issuing their own charge cards. The modern credit card industry took off with the release of the Diners Club card in 1950, the American Express card in 1958 and the BankAmericard, also in 1958, as a single-bank card. The BankAmericard became a franchised card in 1966 and was the precursor to Visa. There were no electronic terminals back then, so you wouldn't have been able to use a PIN.
Network vs. Cards
In the 1970s and 1980s, U.S. credit card companies developed electronic authorization systems that used phone lines to verify cards and approve transactions. This was a major advance over the old process, in which merchants had to phone in for authorization or look the card up in a book. A PIN could have been added to that system, but ATMs were rare -- only about 2,000 in the United States in 1973 -- so customers weren't used to entering PINs, and building a PIN infrastructure into cards and readers would have added cost.
For decades, the U.S. credit card infrastructure was based on magnetic stripe cards. Moving to a PIN-based system required two major changes. First, credit card issuers needed to replace all of their cards with either chip-based cards or contactless cards using radio frequency identification. Most chose chip cards, although contactless cards have also entered the American market. Issuers were slow to make these changes because of the expense. Second, merchants needed to replace or reprogram their credit card readers. And banks didn't necessarily benefit from improving security: in other countries, requiring PINs for credit cards lets issuers shift the liability for fraud to cardholders, but U.S. law doesn't allow this.
Change did come. To limit fraud, U.S. credit card networks adopted a chip system, as their European counterparts had years earlier. Unlike in Europe, however, most chip credit cards in the U.S. are currently chip-and-signature. Most newly issued credit cards are chip cards, and the majority of retailers accept them; even contactless payment terminals now support chip cards. Chip-and-PIN may still be in the future of American credit cards, but most companies are focusing on the chip-and-signature system for now.
Smart Cards and PINs
U.S. issuers are just beginning to support the global chip-and-PIN standard. Visa began requiring processors to support chip-and-PIN cards in 2013, then pushed the technology by gradually making the banks that service merchants, rather than the card issuers, liable for the costs of fraud. Mastercard followed a similar timetable, requiring most terminals to support PINs by 2015 and fuel dispensers by 2017. Barclaycard is one of the few major credit card companies in the U.S. offering its customers chip-and-PIN credit cards. Chip-and-PIN debit cards, however, are much more common in the U.S. than chip-and-PIN credit cards, and that is likely to remain true for the foreseeable future. Experts expect that even when the U.S. eventually transitions to chip-and-PIN, signature fallback will remain an option. Those who do get chip-and-PIN cards, however, should be able to use them immediately.