I have a website that I built in two languages, with two different “dictionaries” chosen based on the browser language.
In each langXX.php I have
define('SOME_STRING', 'Some string');
etc. But I’ve come to hate this solution because I’m never sure whether I’ve added a new definition to both files, so I’m trying to move all the definitions into MySQL and build a function like
translate('SOME_STRING', 'en'); // outputs 'Some string'
I’m moving to MySQL because I’d like to manage the translations directly via a CMS I will build; I don’t want the webmaster going through FTP and TextMate.
So my question is: is this latter solution better, or is it too stressful for MySQL to run a query for each text element of the page?
P.S. As far as I remember, even osCommerce used this practice, right?
7 Answers
You’d be better off doing a single query to download the entire dictionary for the language you want, then looking things up in that array every time you want to display something on the page.
Say your dictionary has 500 phrases in it, and you want to display 30 ‘elements’ on your page.
You’ll do a single query and then 30 in-memory lookups to find the elements you want.
Much quicker than 30 separate MySQL calls.
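A minimal sketch of that approach (the 'translations' table, its columns, and the connection details are placeholder assumptions):
<?php
// Connection details are placeholders.
$db = new PDO('mysql:host=localhost;dbname=site;charset=utf8mb4', 'user', 'pass');

// One query loads the whole dictionary for the page's language...
$stmt = $db->prepare('SELECT string_key, text FROM translations WHERE lang = ?');
$stmt->execute(['en']);
$dictionary = $stmt->fetchAll(PDO::FETCH_KEY_PAIR); // string_key => text

// ...then each of the 30 page elements is a cheap in-memory array lookup
// instead of a separate MySQL round trip.
echo $dictionary['SOME_STRING']; // 'Some string'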
A simple method for translating the text elements of your website can be broken down into a simple three-table structure, along the lines of the sketch below.
This way, a single query can obtain the requested content in any language, automatically falling back to the default language if there is no translation.
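A minimal sketch of what such a three-table schema and fallback query could look like (all table and column names are assumptions, since the original structure isn't shown here):
<?php
$db = new PDO('mysql:host=localhost;dbname=site;charset=utf8mb4', 'user', 'pass');

// One table for languages, one for string keys, one for translations.
$db->exec("CREATE TABLE languages (code CHAR(2) PRIMARY KEY)");
$db->exec("CREATE TABLE string_keys (
               id INT AUTO_INCREMENT PRIMARY KEY,
               string_key VARCHAR(64) NOT NULL UNIQUE)");
$db->exec("CREATE TABLE translations (
               key_id INT NOT NULL,
               lang CHAR(2) NOT NULL,
               text TEXT NOT NULL,
               PRIMARY KEY (key_id, lang))");

// Fetch a string in the requested language, falling back to the default ('en'):
$stmt = $db->prepare(
    "SELECT COALESCE(t.text, d.text)
       FROM string_keys k
       LEFT JOIN translations t ON t.key_id = k.id AND t.lang = :lang
       LEFT JOIN translations d ON d.key_id = k.id AND d.lang = 'en'
      WHERE k.string_key = :key"
);
$stmt->execute([':lang' => 'it', ':key' => 'SOME_STRING']);
echo $stmt->fetchColumn(); // Italian text if present, otherwise the English one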
WordPress uses this: _e("String in default language");
If you specify a language via setlocale() or something like that, it will search for a translation in a file (or, in your case, a database) and output the translated text. If there isn’t a translation, it prints the string as it is.
This variant has a clear advantage: even when a translation is missing, the visitor still sees readable text in the default language rather than an error or an empty string.
See https://codex.wordpress.org/I18n_for_WordPress_Developers for more information. I think this is a really good solution.
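A minimal sketch of that fallback behaviour (this shows only the pattern, not WordPress’s actual implementation; the function and variable names are made up):
<?php
// Return the translation if one exists, otherwise the string as it is.
function translate_or_echo_back(string $text, array $dictionary): string
{
    return $dictionary[$text] ?? $text;
}

$dictionary = ['Welcome' => 'Benvenuto']; // loaded from a file or from MySQL
echo translate_or_echo_back('Welcome', $dictionary); // 'Benvenuto'
echo translate_or_echo_back('Goodbye', $dictionary); // 'Goodbye' (no translation yet)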
Why not use file caching of the data stored in the database? You remove a potentially costly, heavily repeated set of database calls, yet retain the advantage of managing the language values through your CMS.
I wouldn’t worry about speed and stress. Do what makes sense first, then work on the speed afterwards.
I have a similar setup using MySQL to store translations, which I enter and modify through my administration interface. Whenever that table changes, I rewrite a local file containing a translation array built from all the translations in my database; it is a text file of the PHP code needed to build the array, which I can include wherever appropriate.
MySQL combined with my CMS gives me a nice way to enter translations and ensures they cover all the languages I want.
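A minimal sketch of that regeneration step (table, column, and file names are assumptions):
<?php
// Called by the CMS whenever a translation is added or edited.
function rebuild_translation_file(PDO $db, string $lang): void
{
    $stmt = $db->prepare('SELECT string_key, text FROM translations WHERE lang = ?');
    $stmt->execute([$lang]);
    $dictionary = $stmt->fetchAll(PDO::FETCH_KEY_PAIR);

    // var_export() emits valid PHP, so the file can simply be include()d later.
    $code = '<?php return ' . var_export($dictionary, true) . ';';
    file_put_contents("lang_{$lang}.php", $code, LOCK_EX);
}

// At page-render time there is no database call at all:
$dictionary = include 'lang_en.php';
echo $dictionary['SOME_STRING'];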
You can use gettext or any of the internationalization APIs available in PHP.
I like gettext because it’s available in many languages (C/C++, PHP, Python…) with nearly the same API.
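A minimal usage sketch with PHP’s gettext extension (the domain name and catalog path are placeholders, and a compiled .mo file must already exist there):
<?php
// Pick the locale; some systems also need the environment variable set.
putenv('LC_ALL=fr_FR.UTF-8');
setlocale(LC_ALL, 'fr_FR.UTF-8');

// Catalogs are expected at ./locale/fr_FR/LC_MESSAGES/messages.mo
bindtextdomain('messages', __DIR__ . '/locale');
textdomain('messages');

// Prints the French translation if the catalog has one, otherwise the string as-is.
echo _('Some string');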
There are two kinds of situations in localization:
The first (let’s call it “static”) covers content that is part of the user interface (for example, form labels, locale-specific icons, and so on).
The second (call it “dynamic”) is content contributed by users that should be saved for some concrete locale/culture.
Static content should be cached. It can be stored in a database, XML files, or any other source and/or format. Since static content changes rarely, there’s no need to hit the localization store (database, XML, or whatever) on every request: just put the content in some cache and read it from there.
Dynamic content shouldn’t be cached, or if it is cached, the cache must be refreshed every time any part of the content changes in the store.
So, at the end of the day, any content, whether static or dynamic, should be served from a content cache in order to avoid database connections and queries and, obviously, to increase your web application’s performance.
Your second approach would be fine as long as it doesn’t query the database every time some localized content needs to be loaded.
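A minimal sketch of that cache-first pattern (assuming the APCu extension is installed; the key name, TTL, and table layout are illustrative):
<?php
// Serve the static dictionary from an in-memory cache, touching MySQL
// only on a cache miss.
function load_dictionary(PDO $db, string $lang): array
{
    $dictionary = apcu_fetch("dict_{$lang}", $hit);
    if ($hit) {
        return $dictionary;
    }

    $stmt = $db->prepare('SELECT string_key, text FROM translations WHERE lang = ?');
    $stmt->execute([$lang]);
    $dictionary = $stmt->fetchAll(PDO::FETCH_KEY_PAIR);

    apcu_store("dict_{$lang}", $dictionary, 3600); // re-read at most once an hour
    return $dictionary;
}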