This article aims to implement Chinese input methods (Pinyin, Phonetic, handwriting) on the Android platform. First, it briefly introduces the Android input method framework and its key components; then the possible resources for reference are listed and analyzed; finally, a draft solution for the Chinese IM is proposed and pre-developed on the target platform, with an FAQ and some comments from the mailing list appended.
Note: handwriting support is still in progress; the uniform Chinese input interfaces are still under definition.
1. InputMethod Framework and Key Components
a). InputMethodService provides a standard implementation of an input method (IME), from which a final implementation can derive and customize (via AbstractInputMethodService and the InputMethod interface). InputMethodService provides a basic framework for the standard UI elements (input view, candidates view, fullscreen mode), but it is up to a particular implementor to decide how to use them (implement an input area with a keyboard, draw text, or use no input area at all and rely on text-to-speech conversion). All the elements are placed together in a single window managed by InputMethodService. It provides callbacks to obtain the required information, and APIs for programmatic control over the elements.
3 kinds of views:
SoftInputView:
Create an instance: onCreateInputView(); // after creation, it calls back on the InputMethodService to interact with the application as appropriate
Show or not on screen: onEvaluateInputViewShown()
updateInputViewShown() // forces re-evaluation
The default is shown; but when a hardware keyboard is present, the view should be hidden.
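As a minimal sketch of these callbacks, an InputMethodService subclass might look like the following (R.layout.input and R.xml.qwerty are hypothetical resources, and hiding the view on a hardware keyboard is one possible policy, not the only one):

```java
import android.content.res.Configuration;
import android.inputmethodservice.InputMethodService;
import android.inputmethodservice.Keyboard;
import android.inputmethodservice.KeyboardView;
import android.view.View;

// Sketch IME demonstrating the soft input view lifecycle.
public class SketchIME extends InputMethodService {

    @Override
    public View onCreateInputView() {
        // Inflate the keyboard view that the framework will place
        // in the IME window; called when the input view is first needed.
        KeyboardView view = (KeyboardView) getLayoutInflater()
                .inflate(R.layout.input, null);
        view.setKeyboard(new Keyboard(this, R.xml.qwerty));
        return view;
    }

    @Override
    public boolean onEvaluateInputViewShown() {
        // Hide the soft keyboard when a hardware keyboard is available.
        Configuration config = getResources().getConfiguration();
        return config.keyboard == Configuration.KEYBOARD_NOKEYS
                && super.onEvaluateInputViewShown();
    }
}
```

When the hardware keyboard state changes, calling updateInputViewShown() makes the framework re-run onEvaluateInputViewShown() and show or hide the view accordingly.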
CandidatesView:
Create an instance: onCreateCandidatesView();
Show or not on screen: setCandidatesViewShown(boolean)
FullScreenMode:
When the input method UI is too large to integrate with the application UI, the IME simply takes over the screen: the input method window fills the entire screen and adds its own 'extracted text' editor showing the user the text being typed. There is a standard implementation of the 'extracted text' editor in this mode, so no further change is needed. The editor sits at the top of the IME, above the input and candidates views.
Show or not on screen: onEvaluateFullscreenMode() // typically enter this mode when the screen is in landscape orientation.
Special in FullScreenMode: onDisplayCompletions() shows completions generated by apps.
Delivering generated text to the apps:
Path: via the android.view.inputmethod.InputConnection interface (obtained with getCurrentInputConnection());
Input type (password, phone number, ...): android.view.inputmethod.EditorInfo.inputType
Switching between input targets: onFinishInput() and onStartInput(EditorInfo, boolean)
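A small sketch of these three pieces working together (the class name and the numeric-only policy are assumptions for illustration):

```java
import android.inputmethodservice.InputMethodService;
import android.text.InputType;
import android.view.inputmethod.EditorInfo;
import android.view.inputmethod.InputConnection;

// Sketch: sending text to the client app and adapting to the editor type.
public class ConnectionSketchIME extends InputMethodService {
    private boolean mNumericOnly;

    @Override
    public void onStartInput(EditorInfo attribute, boolean restarting) {
        super.onStartInput(attribute, restarting);
        // Inspect the editor's declared input class, e.g. to pick a layout.
        mNumericOnly = (attribute.inputType & InputType.TYPE_MASK_CLASS)
                == InputType.TYPE_CLASS_NUMBER;
    }

    // Commit a chosen candidate to whatever editor currently has focus.
    void commitCandidate(CharSequence text) {
        InputConnection ic = getCurrentInputConnection();
        if (ic != null) {
            ic.commitText(text, 1); // 1 = place cursor after the text
        }
    }
}
```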
b). InputMethodManager provides the central APIs to the IMF architecture, arbitrating the interaction between applications and the current IM. Use Context.getSystemService(String) to retrieve it.
InputMethodManagerService (which extends IInputMethodManager.Stub) provides the system service that manages input methods.
IMM: the client-side API exists in each application context and communicates with the global system service that manages the interaction across all processes.
IME: implements a particular interaction model allowing the user to generate text. The system binds to the current input method, causing it to be created and run, and tells it when to hide and show its UI. Only one IME runs at a time.
Client apps: arbitrate with the IMM for input focus and control over the state of the IME.
An IME is implemented as an android.app.Service, typically deriving from InputMethodService. Implementors only need to deal with the higher-level APIs there; the core InputMethod interface is normally handled by InputMethodService.
A client app can ask the system to let the user pick a new IME, but cannot programmatically switch to one itself; this guards against malicious apps. The user must explicitly enable a new IME in settings before they can switch to it.
c). InputMethod Interface
Represents an input method, which can generate key events and text (digits, email addresses, CJK, ...) while handling various input events, and sends the text back to the apps. Apps do not use this interface directly but go through android.widget.TextView/EditText.
Generally, an IM implementation should be derived from InputMethodService or its subclasses. When implementing an IM, the service component containing it must also supply a SERVICE_META_DATA meta-data field, referencing an XML resource that provides details about the IM. All IMs also MUST require that clients hold the android.Manifest.permission#BIND_INPUT_METHOD permission in order to interact with the service; otherwise, the system will not use that IM, because it cannot trust it.
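The service declaration and meta-data described above might look like the following in a hypothetical IME package (the class name, resource names, and settings activity are placeholders):

```xml
<!-- AndroidManifest.xml: IME service with the required permission. -->
<service android:name=".SketchIME"
         android:permission="android.permission.BIND_INPUT_METHOD">
    <intent-filter>
        <action android:name="android.view.InputMethod" />
    </intent-filter>
    <!-- SERVICE_META_DATA: references an XML resource describing the IM -->
    <meta-data android:name="android.view.im"
               android:resource="@xml/method" />
</service>

<!-- res/xml/method.xml -->
<input-method xmlns:android="http://schemas.android.com/apk/res/android"
    android:settingsActivity=".SettingsActivity" />
```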
The InputMethod interface is actually split into two parts: the top-level interface to the IM, providing all access to it (only the system can access it, due to the BIND_INPUT_METHOD permission requirement); and its method createSession(android.view.inputmethod.InputMethod.SessionCallback), which can be called to instantiate a secondary interface (InputMethodSession) that clients use to communicate with the input method.
d). InputMethodInfo
Used to specify the meta information of an InputMethod. The SERVICE_META_DATA mentioned in c) is used in the InputMethodInfo constructor.
e). InputMethodSettings
Displays preferences for input methods.
f). System Service Registration
frameworks/base/core/java/android/content/Context.java
g). Keyboard: loads an XML description of a keyboard and stores the attributes of the keys.
h). KeyboardView: a view that renders a virtual keyboard, handling the rendering of keys and the detection of key presses and touch movements.
In summary, the IMF looks like this:

        apps (widget.TextView/EditText)
          ^                           |
          | (InputConnection)         |  class: InputMethodManager
          |                           |  (client-side API in each
          |                           |   app context)
    IME: InputMethodService           v
      - 3 kinds of views         system service:
      - InputMethodSession       InputMethodManagerService
        (created via the         (arbitrates the interaction
         InputMethod interface:   between the IME and apps)
         (un)bindInput, startInput,
         stopInput, Session APIs)

    Implementations of IMEs are derived from InputMethodService:
      IME1, IME2, ...
Last but not least, the SoftKeyboard package located in development/samples/SoftKeyboard is a reference IME implementation of a soft keyboard, which can be verified by typing 'ime list' on the target console.
2. Implementation on Android
2.1 PinYin Input and Phonetic Input
On the desktop, there are several mature solutions/implementations for reference, such as the SCIM framework and the phonetic reference KerKerInput. Even so, coming back to our task, it should be easy to integrate PinYin and Phonetic input based on the SoftKeyboard implementation. The work includes the following:
- Define a keyboard layout (PinYin uses the same layout as the alphabetic keyboard; Phonetic follows the layout used in Taiwan)
- Create the related keyboard instance
- Hijack the characters input and translate them to Chinese characters (SQLite is introduced to store the Phonetic table)
- Modify the CandidatesView so that the user can select the expected character
Note: prediction is not supported yet.
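To make the translation step concrete, here is a toy syllable-to-candidates lookup in plain Java (the class name, the in-memory map, and the two sample entries are illustrative assumptions; the article's actual design backs the table with SQLite):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Toy PinYin-to-candidates table; a real IME would back this with SQLite
// and rank candidates by frequency. The entries are illustrative only.
class PinyinTable {
    private final Map<String, List<String>> table = new LinkedHashMap<>();

    PinyinTable() {
        table.put("ni", List.of("你", "尼", "泥"));
        table.put("hao", List.of("好", "号", "毫"));
    }

    // Return candidate characters for a typed syllable (empty if unknown),
    // to be displayed in the CandidatesView for user selection.
    List<String> candidates(String syllable) {
        return table.getOrDefault(syllable, List.of());
    }
}
```

The CandidatesView then shows the returned list, and the selected character is committed to the app through the InputConnection.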
2.2 Handwriting Input
This is a rather complicated topic and task. The differences from PinYin and Phonetic input lie in:
- user input is graphics (e.g. bitmaps of strokes), not characters
- handwriting engines are native libraries, not Java APIs or Java libraries
- user experience
After a thorough analysis of the IMF and the SoftKeyboard implementation, a possible approach emerges:
- Take SoftKeyboard as the example: handwriting is categorized as an IME and SoftKeyboard is also an IME, so it should be possible to reuse much of the code;
- Handwriting works in FullScreenMode (the IME supports three kinds of views);
- Modify how input is acquired and dispatched: as mentioned in 1.a), an input method in fullscreen mode uses its own 'extracted text' editor to show the user the text typed, and this editor needs no change; it sits at the top of the IME, above the input view and candidates view. Here the 'extracted text' editor plays the role of the target app displaying the final input characters, so the calls to the handwriting engine and the character translation should be embedded here.
Pending issue:
a) Accessing native libraries on Android: JNI, sockets, ... but none resolved yet.
Here (http://blog.chinaunix.net/u2/87831/showart.php?id=1838656) is reference code on my blog, based on MediaPlayer. Some people have also taken a simple 'hello' example and succeeded; what about a relatively complicated case?
In addition, there is a hot discussion on 'native code access/support' on the mailing list; take a look at the mail on Android-Dev-Groups between an Android engineer and an LG engineer, but it seems there has been no progress so far. An Android engineer said in Feb. 2009 that 'native code access' is not supported in the then-current SDK (http://groups.google.com/group/android-developers/browse_thread/thread/d68364976e5d98ff/733eea4a1195527e?lnk=gst&q=native+support#733eea4a1195527ewq).
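To make the JNI route concrete, a minimal wrapper sketch might look like the following (the library name "hwengine" and the recognize() signature are assumptions; the matching native implementation must be built separately as libhwengine.so):

```java
// Sketch of wrapping a native handwriting engine via JNI.
// "hwengine" and the recognize() signature are hypothetical.
public class HandwritingEngine {
    static {
        // Loads libhwengine.so from the library search path.
        System.loadLibrary("hwengine");
    }

    // Implemented in native code: takes stroke points as
    // (x0, y0, x1, y1, ...) and returns candidate characters.
    public native String[] recognize(float[] strokePoints);
}
```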
At last, through experiment, accessing native code via JNI works.

=============== Appendix ===============
A: Comments on the IMF from the mailing list:
There are a few important things for app developers to know about
interacting well with the new input method system, which we will be
talking about more as the branch stabilizes and is ready for use as an
SDK:
- There is a new android:inputType
attribute and a setInputType() method on TextView for controlling how
your text should be managed. These replace the android:password,
android:singleLine, android:numeric,
android:phoneNumber,android:inputMethod, android:capitalize,
android:autotext, android:editable attributes and let you specify
additional details about your text. People developing against cupcake
should use android:inputType in all new code. The framework does
interpret the old attributes into a new style input type, but where you
can it is good to update to android:inputType so you can supply the
additional information.
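As a hypothetical layout snippet (the ID is a placeholder), the single attribute replaces the old flag-style attributes:

```xml
<!-- The combined android:inputType replaces the old
     android:password / android:numeric style attributes. -->
<EditText
    android:id="@+id/pin_entry"
    android:layout_width="fill_parent"
    android:layout_height="wrap_content"
    android:inputType="textPassword" />
```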
- There is a new "softInputMode" you can specify for a window, to control whether the soft keyboard is displayed automatically
when your window is shown and whether your window is panned or resized
when it is shown. You can specify this either with
Window.setSoftInputMode(), in a custom Theme, or with a new
android:windowSoftInputMode attribute on an activity in its manifest.
In general I think the system does a decent job of deciding what to do
with windows automatically, but there will certainly be cases where you
want to specify this yourself, especially to have the soft keyboard
displayed automatically.
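For the manifest route, the declaration might look like this (the activity name and the chosen flags are illustrative):

```xml
<!-- AndroidManifest.xml: show the soft keyboard and resize the window
     when this activity gains focus. Activity name is a placeholder. -->
<activity android:name=".ComposeActivity"
          android:windowSoftInputMode="stateVisible|adjustResize" />
```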
- The way a user gets to a soft keyboard is by pressing on an editable text view. This means that applications must not implement their own behavior for tapping on it.
- The new InputMethodManager class (which can be retrieved by getSystemService()) provides programmatic control of the soft keyboard.
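A small client-side sketch of that programmatic control (the activity and its single EditText are assumptions for illustration):

```java
import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.view.inputmethod.InputMethodManager;
import android.widget.EditText;

// Sketch: requesting the soft keyboard from a client Activity.
public class DemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        EditText edit = new EditText(this);
        setContentView(edit);

        // Retrieve the per-context client API to the IMF.
        InputMethodManager imm = (InputMethodManager)
                getSystemService(Context.INPUT_METHOD_SERVICE);
        // Explicitly ask for the soft keyboard on our editor.
        imm.showSoftInput(edit, InputMethodManager.SHOW_IMPLICIT);
    }
}
```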
B: FAQ:
a). W/ResourceType( 6315): Unable to get buffer of resource asset file
The ADT's PreCompilerBuilder uses aapt to generate the R.java file. The android.jar includes a compressed resources file called resources.arsc that aapt needs to extract into a buffer. The error above indicates that it failed to create this buffer.
The root cause is that the buffer max size is set to 1 MB. The resources.arsc file in prior versions was smaller than this limit, but in cupcake it is about 1.3 MB. As a quick workaround, you can increase the buffer size to 2 MB. To do this, look for the Asset.h file under mydroid/frameworks/base/include/utils, change
UNCOMPRESS_DATA_MAX = 1 * 1024 * 1024
to UNCOMPRESS_DATA_MAX = 2 * 1024 * 1024
and rebuild the SDK.
b). ERROR: This attribute must be localized ('text' with value '').
Fix: add the localized text in app_pkg/res/values/strings.xml; in code, the widget IDs are retrieved via mStatusText = (TextView) findViewById(R.id.statustext);
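A hypothetical resource entry for the fix (the string name "status_ready" is a placeholder):

```xml
<!-- res/values/strings.xml: give the widget localized text instead of
     an empty literal, then reference it as android:text="@string/status_ready". -->
<resources>
    <string name="status_ready">Ready</string>
</resources>
```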
====== Not the end! The solution will be updated continuously ======