2008-11-25 09:13:26
I'm developing a multilanguage software. As far as the application code goes, localizability is not an issue. We can use language specific resources and have all kinds of tools that work well with them.
But what is the best approach to defining a multilanguage database schema? Let's say we have a lot of tables (100 or more), and each table can have multiple columns that can be localized (most of the nvarchar columns should be localizable). For instance, one of the tables might hold product information:
CREATE TABLE T_PRODUCT (
NAME NVARCHAR(50),
DESCRIPTION NTEXT,
PRICE NUMBER(18, 2)
)
I can think of three approaches to support multilingual text in NAME and DESCRIPTION columns:
Separate column for each language
When we add a new language to the system, we must create additional columns to store the translated text, like this:
CREATE TABLE T_PRODUCT (
NAME_EN NVARCHAR(50),
NAME_DE NVARCHAR(50),
NAME_SP NVARCHAR(50),
DESCRIPTION_EN NTEXT,
DESCRIPTION_DE NTEXT,
DESCRIPTION_SP NTEXT,
PRICE NUMBER(18, 2)
)
Translation table with columns for each language
Instead of storing translated text, only a foreign key to the translations table is stored. The translations table contains a column for each language.
CREATE TABLE T_PRODUCT (
NAME_FK int,
DESCRIPTION_FK int,
PRICE NUMBER(18, 2)
)
CREATE TABLE T_TRANSLATION (
TRANSLATION_ID,
TEXT_EN NTEXT,
TEXT_DE NTEXT,
TEXT_SP NTEXT
)
Translation tables with rows for each language
Instead of storing translated text, only a foreign key to the translations table is stored. The translations table contains only a key, and a separate table contains a row for each translation to a language.
CREATE TABLE T_PRODUCT (
NAME_FK int,
DESCRIPTION_FK int,
PRICE NUMBER(18, 2)
)
CREATE TABLE T_TRANSLATION (
TRANSLATION_ID
)
CREATE TABLE T_TRANSLATION_ENTRY (
TRANSLATION_FK,
LANGUAGE_FK,
TRANSLATED_TEXT NTEXT
)
CREATE TABLE T_TRANSLATION_LANGUAGE (
LANGUAGE_ID,
LANGUAGE_CODE CHAR(2)
)
There are pros and cons to each solution, and I would like to know what your experiences are with these approaches, what you recommend, and how you would go about designing a multilanguage database schema.
@Stefan Steiger 2014-09-26 09:28:08
This is an interesting issue, so let's necromance.
Let's start with the problems of method 1:
Problem: You're denormalizing to save speed.
In SQL (except PostgreSQL with hstore), you can't pass the language as a parameter and say:
So you have to do this:
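A minimal sketch of what that means in practice, using the per-language columns from the question; the @in_language parameter and the CASE workaround are illustrative assumptions, not the original code:

-- What you would like to write (NOT valid SQL - a column name cannot come from a parameter):
--   SELECT NAME_@in_language, PRICE FROM T_PRODUCT
-- So every language gets hard-coded into every query instead:
DECLARE @in_language CHAR(2) = 'DE';

SELECT CASE @in_language
           WHEN 'EN' THEN NAME_EN
           WHEN 'DE' THEN NAME_DE
           WHEN 'SP' THEN NAME_SP
       END AS NAME,
       PRICE
FROM T_PRODUCT;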
Which means you have to alter ALL your queries if you add a new language. This naturally leads to using "dynamic SQL", so you don't have to alter all your queries.
This usually results in something like this (and it can't be used in views or table-valued functions by the way, which really is a problem if you actually need to filter the reporting date)
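Something along these lines; the procedure name, the MODIFIED_DATE column, and the parameters are made up for illustration, but the pattern - concatenating the column name and the filter values into a string and executing it - is the point:

-- Illustrative dynamic SQL: the language-specific column is chosen at runtime
-- by building the statement as a string.
CREATE PROCEDURE dbo.GetProductReport
    @in_language CHAR(2),
    @report_date NVARCHAR(30)   -- arrives as text, hence the date-formatting problem below
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);

    SET @sql = N'SELECT NAME_' + @in_language + N', PRICE '
             + N'FROM T_PRODUCT '
             + N'WHERE MODIFIED_DATE >= ''' + @report_date + N'''';

    -- Nothing is syntax-checked until this executes, and none of it can be
    -- used inside a view or table-valued function.
    EXEC sp_executesql @sql;
END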
The problem with this is
a) Date-formatting is very language-specific, so you get a problem there, if you don't input in ISO format (which the average garden-variety programmer usually doesn't do, and in case of a report the user sure as hell won't do for you, even if explicitly instructed to do so).
and
b) most significantly, you lose any kind of syntax checking. If <insert name of your "favourite" person here> alters the schema because suddenly the requirements for wing change, a new table is created, the old one is left in place but the reference field is renamed, you don't get any kind of warning. A report even works when you run it without selecting the wing parameter (==> guid.empty). But suddenly, when an actual user actually selects a wing ==> boom. This method completely breaks any kind of testing.
Method 2:
In a nutshell: "Great" idea (warning - sarcasm), let's combine the disadvantages of method 3 (slow speed when many entries) with the rather horrible disadvantages of method 1.
The only advantage of this method is that you keep all translations in one table, and therefore make maintenance simple. However, the same thing can be achieved with method 1 plus a dynamic-SQL stored procedure and a (possibly temporary) table containing the translations and the name of the target table (and that is quite simple, assuming you named all your text fields the same).
Method 3:
One table for all translations: Disadvantage: you have to store n foreign keys in the products table for the n fields you want to translate, and therefore do n joins for those n fields. When the translation table is global, it has many entries, and the joins become slow. Also, you always have to join the T_TRANSLATION table n times for n fields, which is quite an overhead. Now, what do you do when you must accommodate custom translations per customer? You'll have to add another 2 x n joins onto an additional table. If you have to join, say, 10 tables, with 2x2xn = 4n additional joins, what a mess! Also, this design makes it possible to use the same translation in 2 tables. If I change the item name in one table, do I really want to change an entry in another table as well EVERY SINGLE TIME?
Plus you can't delete and re-insert the table anymore, because there are now foreign keys IN THE PRODUCT TABLE(s)... you can of course omit setting the FKs, and then <insert name of your "favourite" person here> can delete the table and re-insert all entries with newid() [or by specifying the id in the insert, but having identity-insert OFF], and that would (and will) lead to data garbage (and null-reference exceptions) really soon.
Method 4 (not listed): storing all the languages in an XML field in the database, e.g. a <lang> element with one child node per language (a sketch follows below).
Then you can get the value with an XPath query in SQL, putting the language in as a string variable, and you can update the value the same way - replacing the literal /lang/de/... path with '.../' + @in_language + '/...' where the language has to be dynamic.
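This is roughly what it could look like on SQL Server, assuming a hypothetical LOCALIZED_NAME column of type XML on T_PRODUCT (column and variable names are illustrative):

-- Reading: the language can be injected into the XQuery via sql:variable().
DECLARE @in_language VARCHAR(10) = 'de';

SELECT LOCALIZED_NAME.value(
           '(/lang/*[local-name() = sql:variable("@in_language")])[1]',
           'NVARCHAR(255)') AS NAME
FROM T_PRODUCT;

-- Updating: the path passed to .modify() must be a string literal, so a
-- dynamic language here again pushes you towards building SQL strings.
DECLARE @new_text NVARCHAR(255) = N'Fahrrad';

UPDATE T_PRODUCT
SET LOCALIZED_NAME.modify(
        'replace value of (/lang/de/text())[1] with sql:variable("@new_text")')
WHERE LOCALIZED_NAME IS NOT NULL;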
Kind of like the PostgreSQL hstore, except that due to the overhead of parsing XML (instead of reading an entry from an associative array, as PG hstore does) it becomes far too slow; plus, the XML encoding makes it too painful to be useful.
Method 5 (as recommended by SunWuKung, the one you should choose): one translation table for each "Product" table. That means one row per language and several "text" fields, so it requires only ONE (left) join for N fields. Then you can easily add a default field in the "Product" table, you can easily delete and re-insert the translation table, and you can create a second table for custom translations (on demand), which you can also delete and re-insert, and you still have all the foreign keys.
Let's make an example to see how this WORKS:
First, create the tables:
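A minimal sketch under Method 5, with a language table, a product table, and one translation table per product table (all table and column names here are illustrative, not prescribed by the answer):

CREATE TABLE T_Languages (
    Lang_ID     INT          NOT NULL PRIMARY KEY,
    Lang_Code   CHAR(2)      NOT NULL,   -- e.g. 'EN', 'DE'
    Lang_Name   NVARCHAR(50) NOT NULL
);

CREATE TABLE T_Products (
    PROD_ID          INT            NOT NULL PRIMARY KEY,
    PROD_Price       NUMERIC(18, 2) NOT NULL,
    PROD_DefaultName NVARCHAR(255)  NULL    -- optional non-localized fallback
);

CREATE TABLE T_Products_tr (
    PROD_ID          INT            NOT NULL
        REFERENCES T_Products (PROD_ID) ON DELETE CASCADE,
    Lang_ID          INT            NOT NULL
        REFERENCES T_Languages (Lang_ID),
    PROD_Name        NVARCHAR(255)  NOT NULL,
    PROD_Description NVARCHAR(MAX)  NULL,
    PRIMARY KEY (PROD_ID, Lang_ID)
);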
Then fill in the data
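For instance, with a couple of made-up rows (values purely illustrative):

INSERT INTO T_Languages (Lang_ID, Lang_Code, Lang_Name) VALUES
    (1, 'EN', N'English'),
    (2, 'DE', N'German');

INSERT INTO T_Products (PROD_ID, PROD_Price, PROD_DefaultName) VALUES
    (1, 9.99, N'Bicycle');

INSERT INTO T_Products_tr (PROD_ID, Lang_ID, PROD_Name, PROD_Description) VALUES
    (1, 1, N'Bicycle', N'A two-wheeled vehicle'),
    (1, 2, N'Fahrrad', N'Ein zweirädriges Fahrzeug');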
And then query the data:
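The query then needs only one LEFT JOIN per translated table, whatever the number of translated fields, and can fall back to the default name when a translation is missing (again, names are illustrative):

DECLARE @in_lang_id INT = 2;   -- the user's language

SELECT P.PROD_ID,
       COALESCE(TR.PROD_Name, P.PROD_DefaultName) AS PROD_Name,
       TR.PROD_Description,
       P.PROD_Price
FROM T_Products AS P
LEFT JOIN T_Products_tr AS TR
       ON TR.PROD_ID = P.PROD_ID
      AND TR.Lang_ID = @in_lang_id;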
If you're lazy, you can also use the ISO two-letter code ('DE', 'EN', etc.) as the primary key of the language table; then you don't have to look up the language id. But if you do so, you may want to use the IETF language tag instead, which is better, because you get de-CH and de-DE, which are really not the same orthography-wise (double s instead of ß everywhere), although they are the same base language. That's just a tiny detail that may be important to you, especially considering that en-US vs. en-GB/en-CA/en-AU, or fr-FR vs. fr-CA, have similar issues.
Quote: we don't need it, we only do our software in English.
Answer: Yes - but which one ??
Anyway, if you use an integer ID, you're flexible, and can change your method at any later time.
And you should use that integer, because there's nothing more annoying, destructive and troublesome than a botched Db design.
See also RFC 5646 and ISO 639-2.
And if you're still saying "we only make our application for only one culture" (like en-US, usually), and therefore you don't need that extra integer, this would be a good time and place to mention the IANA language tags, wouldn't it?
Because they go like this: de-DE-1901 and de-DE-1996 (there was an orthography reform in 1996...). Try finding a word in a dictionary if it is misspelled; this becomes very important in applications dealing with legal and public-service portals.
More importantly, there are regions that are changing from cyrillic to latin alphabets, which may just be more troublesome than the superficial nuisance of some obscure orthography reform, which is why this might be an important consideration too, depending on which country you live in. One way or the other, it's better to have that integer in there, just in case...
Edit:
And by adding ON DELETE CASCADE after the foreign-key reference, you can simply say DELETE FROM T_Products and get no foreign-key violation.
As for collation, I'd do it like this:
A) Have your own DAL
B) Save the desired collation name in the language table
You might want to put the collations in their own table, e.g.:
C) Have the collation name available in your auth.user.language information
D) Write your SQL like this:
E) Then, you can do this in your DAL:
Which will then give you this perfectly composed SQL-Query
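A sketch under the assumption that {#COLLATION} is an app-side placeholder convention (it is not a T-SQL feature) and that the collation name is stored per language, as in step B:

-- Hypothetical template as it sits in the application/DAL:
--   SELECT P.PROD_ID, TR.PROD_Name
--   FROM T_Products AS P
--   LEFT JOIN T_Products_tr AS TR
--          ON TR.PROD_ID = P.PROD_ID AND TR.Lang_ID = @in_lang_id
--   ORDER BY TR.PROD_Name COLLATE {#COLLATION}

-- After the DAL replaces the placeholder with the collation name stored for
-- the user's language (here 'Latin1_General_CI_AS' as an example), the server
-- receives a perfectly ordinary statement:
DECLARE @in_lang_id INT = 2;

SELECT P.PROD_ID, TR.PROD_Name
FROM T_Products AS P
LEFT JOIN T_Products_tr AS TR
       ON TR.PROD_ID = P.PROD_ID
      AND TR.Lang_ID = @in_lang_id
ORDER BY TR.PROD_Name COLLATE Latin1_General_CI_AS;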
@Eugene Evdokimov 2015-08-31 12:06:05
Good detailed response, many thanks. But what do you think about the collation issues in the Method 5 solution? It seems this is not the best way when you need to sort or filter the translated text in a multilingual environment with different collations. In such a case, Method 2 (which you "ostracized" so quickly :) ) could be a better option, with slight modifications indicating the target collation for each localized column.
@Stefan Steiger 2015-09-03 08:11:57
@Eugene Evdokimov: Yes, but "ORDER BY" is always going to be a problem, because you can't specify it as a variable. My approach would be to save the collation name in the language table, and have this in the user info. Then, in each SQL statement you can say ORDER BY COLUMN_NAME {#COLLATION}, and do a replace in your DAL (cmd.CommandText = cmd.CommandText.Replace("{#COLLATION}", auth.user.language.collation)). Alternatively, you can sort in your application code, e.g. using LINQ. This would also take some processing load off your database. For reports, the report sorts anyway.
@Domino 2015-10-07 14:46:36
o.o This must be the longest SO answer I've seen, and I saw people make whole programs in answers. You're good.
@Domi 2018-08-14 08:00:39
Can totally agree SunWuKung's solution is the best
@bamburik 2013-08-12 07:22:08
Take a look at this example:
I think there's no need to explain it - the structure describes itself.
@Illuminati 2015-12-09 23:21:03
This is good, but how would you search (for example, by product_name)?
@David Létourneau 2016-01-11 14:11:31
Do you have a live example of your sample somewhere? Did you run into any problems using it?
@bamburik 2016-01-14 10:02:26
Sure, I have a multilingual real-estate project where we support 4 languages. The search is a bit complicated, but it's fast. Of course, in large projects it might be slower than it needs to be; in small or medium projects it's OK.
@studyzy 2013-04-03 05:34:02
"Which one is best" is based on the project situation. The first one is easy to select and maintain, and also the performance is best since it don't need to join tables when select entity. If you confirmed that your poject is just only support 2 or 3 languages, and it will not increase, you can use it.
The second one is okey but is hard to understand and maintain. And the performance is worse than first one.
The last one is good at scalability but bad at performance. The T_TRANSLATION_ENTRY table will become larger and larger, it's terrible when you want to retrieve a list of entities from some tables.
@davey 2012-12-13 16:03:31
Would the approach below be viable? Say you have tables where more than one column needs translating: for a product you could have both the product name and the product description that need translating. Could you do the following:
@Bart VW 2012-08-21 20:35:08
I agree with randomizer. I don't see why you need a "translation" table.
I think this is enough:
@randomizer 2012-08-06 18:00:39
I was looking for some tips for localization and found this topic. I was wondering why this is used:
So you get something like user39603 suggests:
Can't you just leave the Translation table out, so you get this:
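Roughly something like the following - a guess at the reduced schema, since the follow-up comment below refers to the per-product text table as ProductItem (all names and types are illustrative):

CREATE TABLE Product (
    productid INT            NOT NULL PRIMARY KEY,
    price     NUMERIC(18, 2) NOT NULL
);

CREATE TABLE [Language] (
    languageid   INT     NOT NULL PRIMARY KEY,
    languagecode CHAR(2) NOT NULL
);

-- One row per product per language, referencing both directly:
CREATE TABLE ProductItem (
    productid     INT           NOT NULL REFERENCES Product (productid),
    languageid    INT           NOT NULL REFERENCES [Language] (languageid),
    [name]        NVARCHAR(50)  NOT NULL,
    [description] NVARCHAR(MAX) NULL,
    PRIMARY KEY (productid, languageid)
);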
@DanMan 2014-12-14 11:44:32
Sure. I'd call the ProductItem table something like ProductTexts or ProductL10n though. Makes more sense.
@SunWuKung 2008-11-27 10:02:17
What do you think about having a related translation table for each translatable table?
This way, if you have multiple translatable columns, it only requires a single join to get them. And since you are not autogenerating a translation id, it may be easier to import items together with their related translations.
The negative side of this is that if you have a complex language fallback mechanism you may need to implement that for each translation table - if you are relying on some stored procedure to do that. If you do that from the app this will probably not be a problem.
Let me know what you think - I am also about to make a decision on this for our next application. So far we have used your 3rd type.
@qbeuek 2008-11-27 10:22:46
This option is similar to my option nr 1 but better. It is still hard to maintain and requires creating new tables for new languages, so I'd be reluctant to implement it.
@SunWuKung 2008-11-27 12:37:27
It doesn't require a new table for a new language - you simply add a new row to the appropriate _tr table for your new language. You only need to create a new _tr table if you create a new translatable table.
@GorillaApe 2012-05-01 10:08:31
I believe that this is a good method. Other methods require tons of left joins, and when you are joining multiple tables that each have translations, say 3 levels deep with 3 fields each, you need 3*3 = 9 left joins just for the translations, versus only 3 with this approach. It is also easier to add constraints etc., and I believe searching is more reasonable.
@Mithril 2014-02-14 03:22:33
When T_PRODUCT has 1 million rows, T_PRODUCT_tr would have 2 million. Would it reduce SQL efficiency much?
@David D 2014-09-08 11:50:42
@Mithril Either way you have 2 million rows. At least you don't need joins with this method.
@Adam Davis 2008-11-25 09:32:42
The third option is the best, for a few reasons:
-Adam
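For illustration, this is roughly how a lookup plays out at query time under option 3, using the schema from the question (the @lang_id variable is an assumption, and each translated field costs one extra join):

DECLARE @lang_id INT = 1;   -- the requested language

SELECT NAME_T.TRANSLATED_TEXT AS PRODUCT_NAME,
       DESC_T.TRANSLATED_TEXT AS PRODUCT_DESCRIPTION,
       P.PRICE
FROM T_PRODUCT AS P
LEFT JOIN T_TRANSLATION_ENTRY AS NAME_T
       ON NAME_T.TRANSLATION_FK = P.NAME_FK
      AND NAME_T.LANGUAGE_FK    = @lang_id
LEFT JOIN T_TRANSLATION_ENTRY AS DESC_T
       ON DESC_T.TRANSLATION_FK = P.DESCRIPTION_FK
      AND DESC_T.LANGUAGE_FK    = @lang_id;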
@Neil Barnwell 2008-12-03 11:58:34
I agree, though personally I'd have a localised table for each main table, to allow foreign keys to be implemented.
@rics 2009-10-14 07:40:24
Although the third option is the cleanest and soundest implementation of the problem, it is more complex than the first one. I think displaying, editing, and reporting the general version needs so much extra effort that it is not always acceptable. I have implemented both solutions; the simpler one was enough when the users needed a read-only (sometimes missing) translation of the "main" application language.
@Frosty Z 2011-01-20 10:08:21
What if the product table contains several translated fields? When retrieving products, you will have to do one additional join per translated field, which will result in severe performance issues. There is (IMO) additional complexity for insert/update/delete as well. The single advantage of this is the lower number of tables. I would go for the method proposed by SunWuKung: I think it's a good balance between performance, complexity, and maintenance issues.
@saber 2011-09-08 04:50:55
@rics- I agree, well what do you suggest to ... ?
@saber 2011-09-08 04:57:53
@Adam - I am confused; maybe I misunderstood. You suggested the third one, right? Please explain in more detail how the relations between those tables are going to work. Do you mean we have to implement Translation and TranslationEntry tables for each table in the DB?
@Adam Davis 2011-09-08 12:05:00
@s.amani No, all the tables in the database refer to one set of translation tables. So for the entire database you'll only have one translation and one translationentry table. The other tables will have keys that link into the translation tables to get the correct entries.
@saber 2011-09-08 13:09:03
@Adam - How is that possible? Do you mean each column in a table will have one entry in the TranslationEntry table, right? Have you thought about the number of records this table will hold?
@Adam Davis 2011-09-13 18:23:58
@s.amani If my explanation is not sufficient, and the original question isn't enough, I suggest you submit a new question asking for a simpler explanation of the third option. Comments are not a good way to give you the information you need.
@Duc Tran 2012-11-16 14:03:57
can anyone provide a clear example of the 3rd option? I think it seems suitable but I don't really get it
@Duc Tran 2012-11-16 16:21:35
@AdamDavis also if I only intend to provide localization for menus and messages all over my website, should I proceed with this approach? Is that too complicated for my purpose?
@Timo Huovinen 2013-11-02 10:05:43
What would a SELECT for this option look like?
@FooBar 2014-01-20 15:35:50
I am really in doubt whether to use this solution or the one bamburik provided (see: stackoverflow.com/a/18181495). Any pros/cons? @AdamDavis
@ClearCloud8 2014-05-19 15:32:15
The third solution seems good to me as well, however..... doesn't this mean you will have 100 foreign key relationships from all the various 100 tables connecting to the one Translation table? (Actually more than 100, if each table might have more than one column needing translation.) I am curious if this would be a problem. The only downside I can think of is it would make a DB Diagram look like a spider web.
@Aleris 2008-11-25 09:59:32
Before going to technical details and solutions, you should stop for a minute and ask a few questions about the requirements. The answers can have a huge impact on the technical solution. Examples of such questions would be:
- Will all languages be used all the time?
- Who and when will fill the columns with the different language versions?
- What happens when a user will need a certain language of a text and there is none in the system?
- Only the texts are to be localized or there are also other items (for example PRICE can be stored in $ and € because they might be different)
@qbeuek 2008-11-25 11:36:23
I know that localization is a much broader topic and I am aware of the issues that you bring to my attention, but currently I am looking for an answer for a very specific problem of schema design. I assume that new languages will be added incrementally and each will be translated almost completely.
@user39603 2008-11-25 09:37:47
I would usually go for this approach (not actual SQL); it corresponds to your last option:
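Something along these lines - a guess at the shape of this approach, based on the Translation and TranslationItem tables mentioned in the comment below (all names and types are illustrative):

CREATE TABLE [Language] (
    languageid   INT     NOT NULL PRIMARY KEY,
    languagecode CHAR(2) NOT NULL            -- 'en', 'de', ...
);

CREATE TABLE Translation (
    translationid INT NOT NULL PRIMARY KEY
);

CREATE TABLE TranslationItem (
    translationitemid INT           NOT NULL PRIMARY KEY,
    translationid     INT           NOT NULL REFERENCES Translation (translationid),
    languageid        INT           NOT NULL REFERENCES [Language] (languageid),
    [text]            NVARCHAR(MAX) NOT NULL
);

CREATE TABLE Product (
    productid                INT            NOT NULL PRIMARY KEY,
    nametranslationid        INT            NOT NULL REFERENCES Translation (translationid),
    descriptiontranslationid INT            NOT NULL REFERENCES Translation (translationid),
    price                    NUMERIC(18, 2) NOT NULL
);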
Because having all translatable texts in one place makes maintenance so much easier. Sometimes translations are outsourced to translation bureaus, this way you can send them just one big export file, and import it back just as easily.
@DanMan 2014-12-14 11:41:05
What purpose does the Translation table or the TranslationItem.translationitemid column serve?