A question for our UK/EU members concerning electronics


I was reading up on obsolete electronics today, as I am wont to do, when I started looking into SCART connections. Everything I've read says these were primarily a European thing, and I've honestly never seen one in the US, but it's obvious they offered superior video quality to anything available here prior to component, though for the life of me I can't seem to find information on WHY they never caught on in the US.

So, anybody know why? I'd like to know, if just to sate my curiosity.

Melanie E.

Comments

SCART from Wikipedia

Patricia Marie Allen's picture

From Wikipedia, the free encyclopedia

SCART (from Syndicat des Constructeurs d'Appareils Radiorécepteurs et Téléviseurs – Radio and Television Receiver Manufacturers' Association) is a French-originated standard and associated 21-pin connector for connecting audio-visual (AV) equipment.

It is also known as Péritel (especially in France), 21-pin EuroSCART (Sharp's marketing term in Asia), Euroconector[1] or EuroAV. In America, another name is EIA Multiport (an EIA interface).

In the States, the EIA Multiport version was apparently available but never caught on, most likely due to marketing or perhaps licensing.

Hugs
Patricia

Happiness is being all dressed up and HAVING some place to go.
Semper in femineo gerunt
Ich bin eine Mann

That says it all

How often has the US accepted a standard originating from them 'commies' in Europe?

My neighbors in NH were fazed by the SCART lead connecting my VCR to the TV I brought from Europe in the mid-1980s. They were even more amazed by the TELETEXT system I demo'd to them, but that is another story. The subtitling (closed captioning, for those in the US) keyboard I used was completely off the wall.

I wonder if SCART interfaces could handle NTSC 525-line signals. They could handle SECAM, though.
Maybe this was one of the reasons it never took off.

Samantha

I read the wikipedia

But I was wondering about more detailed information.
*shrug*
Not a big deal though. Thanks for trying to answer!

Melanie E.

spec's are us

If details are what you want then...

http://martin.hinner.info/vga/scart.html

and

http://www.lpilsley.co.uk/scart.htm

and not forgetting

http://www.js-technology.com/store/info/avguides/scartconnec...
http://utopia.knoware.nl/users/eprebel/SoundAndVision/Engine...

It has mainly been replaced by HDMI. IMHO, this is not all good. There is a noticeable lag when switching to an HDMI connection, due to the encryption at both ends deciding if they are allowed to talk to each other. It can take 5-10 seconds.

Samantha

Not exactly what I was looking for :P

I get the general idea of the connection; I was more interested in whether anyone knew a more detailed history of why it didn't catch on over here. Considering what a focal point TVs have been in US households since their invention, you'd think a superior video input/output would automatically get picked up, but instead we were left with inferior inputs. Even S-Video and component didn't really gain popularity here until the mid-oughts, at least not in my experience, and standard composite isn't exactly known for its fidelity. I'm just curious as to the reasoning behind it all.

It's not a big deal, but I'm glad to see it's interested more people than I!

Melanie E.

SCART does have one advantage over RCA.

One connector instead of six, so there's less chance of feeding your VCR's video output into the TV's left audio channel. On the other hand, if you use a cheap cable it is a pain in the... censored... to connect, as it has lots of flimsy flat pins you have to line up, but only one piece of plastic to hold while maneuvering those pins into the receptacle.
The only reason SCART can seem to have "better picture quality" is that it is European, so it is more likely to carry a PAL or SECAM decoded video signal instead of American NTSC. And NTSC is lower resolution than either of the European analog TV standards.
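For reference, a quick sketch of the line counts behind that resolution claim. The figures are the commonly quoted ones from general knowledge of the analog standards, not from this thread:

```python
# Rough comparison of the analog TV standards mentioned above.
# "Active" lines are the ones that actually carry picture; the rest
# of each frame is blanking and sync.
standards = {
    # name: (total_lines, active_lines, fields_per_second)
    "NTSC":  (525, 480, 60),   # ~59.94 fields/s in practice
    "PAL":   (625, 576, 50),
    "SECAM": (625, 576, 50),   # same line structure as PAL, different colour encoding
}

for name, (total, active, fields) in standards.items():
    frames = fields / 2  # interlaced: two fields per frame
    print(f"{name}: {active}/{total} active lines, {frames:.0f} frames/s")
```

So PAL and SECAM carry 576 visible lines against NTSC's 480, which is the resolution gap being described.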

Still here and going fine

Every bit of equipment in the UK that has a video function, like TV, DVD, VCR and so on, has at least one of these in the back even if they now also have HDMI sockets. Many non-video items such as tuners and amps also have them.

They are SD (Standard Definition) connections, and can pass RGB component as well as composite signals (and, on some equipment, S-Video) along with stereo audio. (I think the US only had composite.) Of course, being analog they can be "hacked" to steal the signals traveling between the two pieces of equipment you already own, so that's why HDMI was invented, among other reasons.
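As a rough illustration of how one plug carries all of that, here is the commonly published SCART pin assignment for the main signals. This is a sketch from general references, not from this thread, and minor variants exist:

```python
# Commonly published SCART signal-pin roles (grounds omitted; most of
# the remaining pins are the matching grounds for these signals).
# Some equipment reuses the R/C pins for S-Video or component variants.
scart_pins = {
    1: "audio out, right",
    2: "audio in, right",
    3: "audio out, left",
    6: "audio in, left",
    7: "blue video",
    8: "function switching (source present / aspect ratio)",
    11: "green video",
    15: "red video (or chrominance in S-Video variants)",
    16: "fast blanking (composite/RGB switching)",
    19: "composite video out",
    20: "composite video in",
    21: "shell / common ground",
}

# Everything the multi-RCA hookup needed, in one connector:
video_pins = [p for p, role in scart_pins.items() if "video" in role]
```

The switching pins (8 and 16) are what let a SCART source force the TV to the right input and the right mode automatically, which is the "idiot proof" part.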

I build, manage and run a mythtv setup here and knowledge of various video standards is essential. Mostly you just plug the two ends into your equipment and the wiring does the rest; it's essentially idiot (ie consumer) proof.

Penny

Marketing

While I'm not familiar with the interface in question, it wouldn't be the first time the U.S. consumer chose an inferior standard due to marketing. Back when cassette video tape systems were just coming here from Japan, there were two competing standards: Sony's Beta vs. Panasonic's VHS. Beta was the obviously superior standard (that's why, until the digital switch, TV stations and other broadcast operations used the Beta system).

But Sony made a marketing mistake. Thinking the consumer would only need 2-3 tapes that would be continuously reused, they manufactured just enough tapes to meet the expected demand.

Panasonic thought on a bigger scale and made sure plenty of blank tapes were available. When both machines hit the U.S. market, the rush for blank tapes left Sony way behind. Panasonic, OTOH, quickly promoted their tape availability, and consumers selected Panasonic over the superior Sony system.

So I figure the ready availability of US-standard equipment over the EU standards made the outcome almost inevitable; at least that would be my guess. Of course there may be other factors I'm not aware of, but my gut says the situation I outlined had a lot to do with it.


I went outside once. The graphics weren't that great.

That sounds like a reasonable assumption

And marketing DOES play a major role in Americans, and even the world, going for inferior products *coughiphonecough* when better alternatives are available.

I'm sure Sony was more than happy with the success of their work with Philips on Blu-ray and its trumping of HD-DVD, though, and more than vindicated over Betamax's underwhelming reception :P

Melanie E.

This was slightly painful to

This was slightly painful to read.

Betamax failed for basically one reason.

predatory licensing.

This is the same reason that MiniDisc players failed, that Sony's MemoryStick line was never picked up by any product manufacturers except Sony and media-reader makers, and that CDs _almost_ failed too. Philips managed to strongarm Sony on CDs to force a common standard without predatory licensing; thus, CDs came out fast and stayed. (Sony has never learned from these failures. They just keep trying.)

VHS became the standard because the adult movie industry (the largest independent consumer/producers of movies at the time) refused to cough up the fees that Sony wanted for -EVERY SINGLE TAPE-. VHS, on the other hand, was an open standard - because that's what JVC (the creators - not Panasonic aka Matsushita) believed in.

Personally, I'd rather have had HD-DVD than Blu-Ray, for the exact same reason. IBM learned its lesson after the MCA architecture fiasco and decided it was more important to get its standards out there than to make a fortune off every unit made. Blu-Ray _solely_ "won" because WalMart agreed to only carry Blu-Ray (probably for a bigger kickback). As the largest retailer of movies, that put the kibosh on HD-DVD. Quality-wise, there was no tangible difference between Blu-Ray and HD-DVD.

As for the repeated "Why doesn't the US suck on the European Tit for standards?", it's simple. Most of the time the original commercial product was developed in the US, and the US already had something that worked (These were often started/invented in Europe, but Europeans can be insanely conservative for new developments - let someone else take the risk). Europe (or a country therein) would then skip the 10 year development/growing pains period, grab the end product, and build on what would seem like a logical set of standards.

It's not that easy to set aside the infrastructure that's already been paid for, in order to go to something new. If you've never had _anything_, it's easy to skip the steps. Look at all the countries that went basically from no telecommunications to cellular networks. They did it because it's much easier to build the towers than run telephone wiring everywhere. Other places with heavy fibre layouts? When they decided to wire places for the first time in 70 years, why _not_ just put in whatever the newest, most durable standard might be? (Yes, you can cut it, but fibre doesn't suffer from corrosion)

For the US, here's the _rough_ development of telephones.

Telegraph - developed in several places in Europe and America between the late 1700s and the 1830s. However, due to various issues, Europe didn't use it for much more than short distances (other than England, which used it for the rails).

1837 - Samuel Morse and his assistant, Vail, developed Morse code and the telegraph system we think of today. Less than 10 years later, the US had linked up all of its major East Coast cities, _and_ by 1861 had linked the east coast to the west. That's 24 years. In Europe, other than England, they hadn't even connected major cities. (England switched to Morse code by 1851.)

Telex? Invented in Germany, but spread mostly in other countries. Undersea cables? Britain. What did France do? Developed a code that was used by everyone else for a while to speed up transmission.

Radio? _not_ Marconi. Tesla (American immigrant). Marconi just commercialized it, and tried to claim it as his own - Tesla won the court cases by proving that Marconi was present at the lectures Tesla gave on the theory.

Telephone? That's AT&T, the original American Telephone & Telegraph (aka the Bell Telephone Company in its original form).

http://www.greatachievements.org/?id=3625 (calendar from 1900 on). Note that most of the major improvements are either US/Canada/Britain, or US only.

Early telephones were like the rails (developed predominantly in Britain): interconnectivity and compatibility were like early cellular, in that systems just couldn't work with each other. One town's telephone system might or might not be able to talk to the next town's. Thus was born the monopoly, which during the development and spreading process was arguably more beneficial than detrimental. The downside was that they didn't see the point in replacing older stuff with newer stuff all that quickly. It took 10+ years for the touch-tone phone to go from finished product to public release. There are even still telephone Central Offices (buildings scattered around towns and cities that centralize areas) that are packed with rotary and panel switching equipment rather than having been replaced with digital switches. (Weirdly enough, the "Rotary System" was used predominantly in Europe and the "Panel System" predominantly in the US/Canada, but both were developed by AT&T subsidiaries.)

In any case, the US went from wired telephones with crank and switchboard, to rotary phones with switchboards and more automated systems, to touch-tone phones, then the explosion of the digital phone era from deregulation, then the advent of analog cellular (anyone remember GTE Mobilenet?), pager systems, and the switch to digital cellular (NexTel/iDEN, CDMA, TDMA, GSM). Europe (probably due to wars more than anything else) skipped big chunks of the development/infrastructure steps. For example, most of Europe has GSM systems because it was mandated by law in 1987. CDMA was a Qualcomm development. Arguably, CDMA systems are often much more advanced, but GSM is cheaper and more widespread due to licensing costs (see the above about Beta and VHS).

Also, keep in mind the relative sizes. What's the joke? "In America, 100 years is a long time; in Europe, 100 miles is a long way." The US, as a single country, is about the size of Europe (3.9 million square miles for Europe, 3.8 million for the US). That means developments in _one_ country can have what seems to Europeans an inordinate impact. Perhaps if Europeans thought of the US as being 50 countries in a union, it might be easier for them to conceive of the relative impact. The US "countries" tend to do things together, thinking of themselves as part of a common group. The European countries think of themselves as distinctly different entities, and most of them have some sort of long-standing dislike for each other. ("You invaded my country four hundred years ago!" "Your country didn't _exist_ four hundred years ago!") (Germany, for example, is younger than the United States; it didn't exist until it was hammered together from the countries known as 'the Germanies' in 1871. Poland is probably the oldest European country with a relatively unbroken governmental structure. England (not the UK!) also has a very old continuous line of governance.)

Am I upset about it? Nope. Someone has to foot the bill for the development, and the US/Canada/Britain triumvirate has been the most able to bear the costs of that development. Various Europeans kicked off the start of a lot of it; they just couldn't get people to cough up to take it beyond that start.

US/Japan is similar. The US has developed a LOT of things, but frequently it was the Japanese that then took that item built for a single purpose, engineered it to be smaller, and pushed it out to the commercial market. No reason to get upset with Japan for making it smaller/faster/better. (Think about the old joke about Japanese Transistor Radios)


I'll get a life when it's proven and substantiated to be better than what I'm currently experiencing.

This is one of the few cases where marketing is not to blame.

SCART cables have just straight pin-to-pin wires. Only extremely expensive SCART cables have any kind of shielding, and that has never been shown to improve anything at distances under 5 meters.
So on a physical level a SCART cable has no advantage over a bunch of RCA cables.

SCART wasn't a thing in the

SCART wasn't a thing in the US? All I know is that it was the connection practically everything in England used before HD became a thing.

The reason is actually quite simple

SCART was used for the most part for component video (that's the reason for its superior quality, as it provided three channels, each carrying only a grey-scale map of a single RGB color, as opposed to one channel carrying all of them). In the US they used a different connector, or more precisely a series of them, to pass a component signal: three RCA connectors, one per channel.

Compared to the three RCA connectors, it made little sense to migrate to SCART. It has a lot of redundancy (pins that are used only for S-Video, or different pins for analog and digital audio), so it was decided to use three RCAs and an auxiliary audio port everywhere but Europe (where SCART was already so common that there was no migration overhead for the TV manufacturers).
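To make the difference concrete, here is a sketch (mine, not the poster's) of the standard BT.601 math relating the RGB that SCART carries directly to the Y'PbPr that the US three-RCA component jacks carry, derived from the same three colour channels:

```python
def rgb_to_ypbpr(r, g, b):
    """Convert normalized R'G'B' (0..1) to analog Y'PbPr (BT.601 coefficients)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: weighted sum of the channels
    pb = (b - y) / 1.772                     # blue-difference, scaled to +/-0.5
    pr = (r - y) / 1.402                     # red-difference, scaled to +/-0.5
    return y, pb, pr

# Pure white: the colour-difference channels vanish, luma is full scale.
y, pb, pr = rgb_to_ypbpr(1.0, 1.0, 1.0)
```

Either way it's three grey-scale maps on three wires; the YPbPr form just concentrates most of the picture in the luma channel, which is why the two schemes were comparable in quality and neither side had a reason to switch connectors.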

In the end, when the inevitable move to digital signals happened, it made little difference, as HDMI/DVI/DisplayPort are far superior in terms of reliability and bandwidth.

Hope that helps; if you want to know something else, just ask :)