

New CLDR data is published roughly twice a year. We need to align with CLDR's work cycle.

Between the publication of new data and the moment their survey tool is reopened for new submissions, we can simply rely on the published data.

When the survey tool is open, we could use it to poll newly suggested data, but

  1. it does not fit our data structure: they do not alter or add authoritative data, but rather collect sets of new/additional suggestions per item. Which one would we anticipate to become final?
  2. I cannot recommend polling because of performance considerations on either side; at the very least we would have to ask for permission, because we could degrade their (sometimes flaky) server performance.

When vetting is open, the situation is basically very similar.

After the vetting phase, until the new data is published, I think we cannot get the new data, but it may still be possible to use the survey tool read-only, while the data should remain unaltered.

Conclusion: Unless we find a special arrangement with CLDR, we can likely treat their data as static and stable from publication until the survey tool is reopened, and maybe even until it is closed. If we are able to find an arrangement with them that allows us to bulk-submit our new data as suggestions towards the end of the survey phase, we can then start our new collection phase right after the (last) submission.

Using a bot to supply data via the survey tool is likely possible. I cannot estimate the labour needed to make such a bot, but one should not be hard to find with a little research. Whether or not CLDR would accept that is unclear. Usually, accounts are given to individuals, allowing them access to two locales: und (undetermined) for trials and tests, and the "real" one they work on. We would need a more universal account anyway. It may be easier to supply our data in a common exchange format such as CSV or XML (LDML), e.g. via e-mail or upload. I suggest finding someone on their staff and talking about this.
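To illustrate the exchange-format idea, here is a minimal sketch of writing suggestions as CSV. The column names and the record layout are assumptions for illustration, not any format CLDR actually specifies:

```python
import csv
import io

# Hypothetical suggestion records: (locale, item path, suggested value).
suggestions = [
    ("ksh", "localeDisplayNames/languages/language[@type='de']", "Deutsch"),
    ("ksh", "localeDisplayNames/languages/language[@type='fr']", "Franzuesesch"),
]

# Serialise the suggestions as CSV, a format that could be e-mailed or uploaded.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["locale", "xpath", "value"])
writer.writerows(suggestions)
print(buf.getvalue())
```

The same records could just as easily be serialised as LDML fragments; CSV is shown here only because it is the simplest of the formats mentioned above.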

Purodha Blissenbach 13:37, 20 March 2011

Nike's original proposal included us localising language names here first and then transferring the whole database to CLDR. I think it might be more realistic and effective if we were to tackle the transfer to CLDR for a few locales, or just one, at a time. That way, we can prioritise the languages which most need this leg up, and which also have enough translators to complete the localisation.

Lloffiwr 23:02, 25 March 2011

So, any volunteers to pick up the communication?

Nike 16:19, 26 March 2011

I did, by e-mail, but no reply yet. I may have to poke several potential partners and shall do that one by one. May take a week or two.

Purodha Blissenbach 11:50, 27 March 2011

Sorry, it looks like I didn't reply properly. Anyway, I just saw your e-mail (I was away from my mail over the weekend, after working very hard on preparing for the ST vetting phase to open). Please do not poke potential partners! It would be better to just file a single bug, but I'm glad to forward the e-mail to the TC if you prefer that.

Srl295 15:24, 28 March 2011

Can you elaborate on what we (translatewiki.net) should do to work on this issue?

Nike 11:34, 31 March 2011
 

I got an e-mail reply from Srl295 last Monday, who asked for permission to forward the note on to the CLDR technical committee, which I granted.

A rough sketch for the translatewiki.net technical part as I currently see it:

  • We could use locales provided in the Locale Data Markup Language (LDML), an XML format designed by CLDR, to extract a set of key => value pairs per locale, which can then be imported into its own namespace on translatewiki.net, possibly including limited amounts of message documentation.
  • The inverse process could be used to generate files with suggestions of additions and updates to be provided to CLDR.
  • Other types of export formats are possible of course as well.
  • While this scheme could transport all possible locale items, we do not have to support them all.
  • I can do the necessary coding.
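The extraction step above can be sketched roughly as follows. The LDML fragment is trimmed for brevity, and the "cldr-language-" message key prefix is an invented example, not an existing translatewiki.net convention:

```python
import xml.etree.ElementTree as ET

# A minimal LDML fragment; real files come from CLDR's common/main/*.xml.
ldml = """<ldml>
  <identity><language type="de"/></identity>
  <localeDisplayNames>
    <languages>
      <language type="en">Englisch</language>
      <language type="fr">Französisch</language>
    </languages>
  </localeDisplayNames>
</ldml>"""

root = ET.fromstring(ldml)
# Extract key => value pairs: one message per language display name.
pairs = {
    "cldr-language-" + el.get("type"): el.text
    for el in root.iterfind("localeDisplayNames/languages/language")
}
print(pairs)  # {'cldr-language-en': 'Englisch', 'cldr-language-fr': 'Französisch'}
```

The inverse process (the second bullet) would walk such a key => value map and emit LDML elements for the changed or added items.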
Purodha Blissenbach 17:33, 31 March 2011