Re: How does Android translate kernel input events to userland input events?



libretro
If you're going to remain this stubborn and refuse to accept that serious developers don't NEED or WANT to go through bloated Java services just for the sake of conforming to your 'framework', then at a certain point developers are going to have had enough and leave for iOS and/or WinRT. It doesn't matter how much market share you have - a 'capital strike' by developers is still a tool we can use to knock some sense into this 'Android engineering team', which seems to lack the very basics of embedded systems engineering.

For the record - here are some more things you and your team are getting dead wrong -

1 - 100/150ms audio latency (seriously - 40ms is not even considered 'low latency' - 100/150ms is just an absolute disaster - do you realize that with latencies like this, you are the laughing stock of almost every other engineering team in the world right now? The PS3 audio server has 40ms latency, and that is considered 'bad'.)
2 - No exposure of SurfaceTexture (yes, real 'serious devs' need direct texture support to avoid costly CPU-to-GPU blits - seriously - console SDKs have offered this since Day One)
3 - No ability to control which services run alongside your native app at runtime - like, I dunno, lots of Java services which do periodic check-ups (worst offenders being Google Play Store and Gmail) and proceed to absolutely DESTROY your runtime performance - every time those services get invoked, the wonderful 'Dalvik' garbage collector does its handiwork and adds an additional 22ms delay. Seriously - how am I supposed to create a realtime app that runs at a steady 60fps if I can't 'control' which of these pre-installed and bloated services run right through my runtime lifecycle?

Is this 2012 or is this 1980? Because these are 'essential' things to get right from Day One. The fact that Android still doesn't have them is worse than aggravating - it's embarrassing for your company. Please do take note - developers will only have patience for so long until they start deserting this platform altogether. WinRT might give you serious competition from which you won't be able to recover if they play their cards right.

On Friday, October 23, 2009 8:54:20 PM UTC+2, Dianne Hackborn wrote:
It's not appropriate for android-developers or anywhere else either.  Please stop asking this question.  I already answered you: you can't do this.  The platform owns access to the input devices, and MUST be the one doing this so it can properly dispatch the events.  Most of this code is in Java, and everything involved with this in the application is in Java, there is simply NO native API.  If you want to have events in your native application, you need to receive them in your Java app and then hand them over to your native code using the normal JNI facilities.
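A minimal sketch of the native side of the hand-off described here: the Java activity receives the KeyEvent and forwards it over JNI, and the native code queues it for its own loop. The class, package, and function names are hypothetical, and the queue assumes producer and consumer run on the same thread (a real app would add locking):

```c
/* Native side of the Java -> JNI hand-off: the Java activity receives
 * KeyEvents and forwards them here; the native loop drains the queue.
 * All names are hypothetical illustrations, not a real Android API. */
#include <stddef.h>

#define QUEUE_CAP 64

typedef struct {
    int key_code;   /* e.g. the value of KeyEvent.getKeyCode() */
    int action;     /* 0 = down, 1 = up */
} NativeKeyEvent;

static NativeKeyEvent queue[QUEUE_CAP];
static size_t head, tail;

/* Called (via the JNI wrapper below) when Java sees a key event.
 * Returns 1 on success, 0 if the queue is full and the event is dropped. */
int push_key_event(int key_code, int action)
{
    size_t next = (tail + 1) % QUEUE_CAP;
    if (next == head)
        return 0;
    queue[tail].key_code = key_code;
    queue[tail].action = action;
    tail = next;
    return 1;
}

/* Called from the native loop to drain pending events.
 * Returns 1 if an event was written to *out, 0 if the queue is empty. */
int pop_key_event(NativeKeyEvent *out)
{
    if (head == tail)
        return 0;
    *out = queue[head];
    head = (head + 1) % QUEUE_CAP;
    return 1;
}

#ifdef __ANDROID__
/* JNI entry point the Java activity would call from onKeyDown()/onKeyUp();
 * the package and class name are made up for illustration. */
#include <jni.h>
JNIEXPORT jboolean JNICALL
Java_com_example_game_MainActivity_nativeOnKey(JNIEnv *env, jobject thiz,
                                               jint keyCode, jint action)
{
    (void)env; (void)thiz;
    return (jboolean)push_key_event(keyCode, action);
}
#endif
```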

On Thu, Oct 22, 2009 at 8:30 PM, Jack Palevich <jac...@...> wrote:
I'm sorry, but this is not an appropriate topic for this group. The NDK does not support input events.

Please ask your question in a more appropriate group, such as android-developers or android-platform.



On Thu, Oct 22, 2009 at 8:19 PM, ZaichengQi <vml...@...> wrote:

Hello, I'm learning how Android handles Linux input events, from kernel raw input events up to the userland-level KeyEvent and so on. I want to handle input events in native code. I've done some experiments on handling events using the NDK, but touch screen events are very hard to handle (when I touch the screen, it generates a lot of kernel events).

I've read the EventHub class in the frameworks/base directory of the Android source repo, so I now know how Android collects raw Linux kernel events with the EventHub class by reading from /dev/input/event*, but I still have no idea how it translates these raw events into the userland logical input events like KeyEvent. The key input service only wraps EventHub in JNI functions; there is no translation there.

So please give me some hints on the kernel-event to userland-event translation process.
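To illustrate both halves of the question: the kernel delivers each touch as several `struct input_event` records (an EV_ABS record per axis, closed by an EV_SYN/SYN_REPORT), which is why one touch "generates a lot of kernel events", and the scancode-to-keycode translation is driven by key layout maps. A self-contained sketch, with the struct and constants mirroring `<linux/input.h>` and `<android/keycodes.h>` and a tiny table standing in for a `.kl` file:

```c
/* Minimal sketch of what EventHub does with records read from
 * /dev/input/event*.  The struct and constants mirror <linux/input.h>
 * and <android/keycodes.h>; they are redefined here so the sketch is
 * self-contained off-device. */
#include <stdint.h>

struct input_event_raw {            /* simplified struct input_event */
    uint64_t sec, usec;             /* timestamp */
    uint16_t type;                  /* EV_KEY, EV_ABS, EV_SYN, ... */
    uint16_t code;                  /* scancode or axis id */
    int32_t  value;
};

enum { EV_SYN = 0x00, EV_KEY = 0x01, EV_ABS = 0x03 };
enum { ABS_X = 0x00, ABS_Y = 0x01 };
enum { SYN_REPORT = 0 };

/* Tiny stand-in for a key layout map: kernel scancode -> Android keycode. */
int scancode_to_keycode(uint16_t scancode)
{
    switch (scancode) {
    case 30: return 29;   /* KEY_A -> AKEYCODE_A */
    case 48: return 30;   /* KEY_B -> AKEYCODE_B */
    default: return 0;    /* AKEYCODE_UNKNOWN */
    }
}

/* A complete "userland" touch sample only exists once SYN_REPORT
 * arrives; until then we just accumulate axis values.  Returns 1 when
 * a full sample is ready to be turned into a motion event. */
typedef struct { int x, y, have_x, have_y; } TouchSample;

int feed_raw_event(TouchSample *s, const struct input_event_raw *ev)
{
    switch (ev->type) {
    case EV_ABS:
        if (ev->code == ABS_X) { s->x = ev->value; s->have_x = 1; }
        if (ev->code == ABS_Y) { s->y = ev->value; s->have_y = 1; }
        return 0;
    case EV_SYN:
        return ev->code == SYN_REPORT && s->have_x && s->have_y;
    default:
        return 0;
    }
}
```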

--
Dianne Hackborn
Android framework engineer
hac...@...

Note: please don't send private questions to me, as I don't have time to provide private support, and so won't reply to such e-mails.  All such questions should be posted on public forums, where I and others can see and answer them.

--
You received this message because you are subscribed to the Google Groups "android-ndk" group.
To view this discussion on the web visit https://groups.google.com/d/msg/android-ndk/-/iGQnyyogevcJ.
To post to this group, send email to [hidden email].
To unsubscribe from this group, send email to [hidden email].
For more options, visit this group at http://groups.google.com/group/android-ndk?hl=en.

jeff shanab
Amen. I cannot for the life of me see how Android is going to survive in the market unless a few critical things change. iOS has great integration between C++ and Objective-C; Windows Phone is C++ moving to C#, but P/Invoke beats JNI hands down. Java is an awesome servlet language running in a container on a server, and it makes a good high-level language for beginners. But it is a little too far from the system for a phone, and being forced to go through it regardless of your existing codebase and more advanced code is a game killer.

All those things aside, however, if they do not solve the awful development environment issues and horrific fragmentation issues, I worry that it may be doomed. If developers have a choice, they avoid it. That will trickle up if not fixed.

Great hardware, Great base system and initial idea, just gotta get back to development productivity.


Dianne Hackborn
In reply to this post by libretro
Wow, where did this rant come from, in reply to an over-three-year-old post? Android 2.3 introduced a greatly expanded NDK, with native APIs for input events, and dispatch to those APIs does not go through Java code. Then you throw in two issues completely irrelevant to this discussion. On the first one (audio latency) there has been much discussion, for example: http://www.rossbencina.com/code/dave-sparks-on-android-audio-latency-at-google-io-2011 from over a year ago, and you can easily find information on progress on that front, such as: http://createdigitalmusic.com/2012/07/android-high-performance-audio-in-4-1-and-what-it-means-plus-libpd-goodness-today/
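To make that native path concrete: since Android 2.3 a NativeActivity can consume events entirely in C via `<android/input.h>` (`AInputQueue_getEvent`, `AInputEvent_getType`, `AKeyEvent_getKeyCode`, `AMotionEvent_getX`/`getY`). The sketch below mocks the event type so it compiles off-device; the type constants and the consumed/not-consumed return convention follow the real API:

```c
/* Shape of the Android 2.3+ NDK input dispatch.  The enum values match
 * <android/input.h>; the event struct is a mock stand-in for the opaque
 * AInputEvent, which on-device you would obtain from AInputQueue_getEvent()
 * and inspect with the accessor functions named in the comments. */
enum {
    AINPUT_EVENT_TYPE_KEY    = 1,
    AINPUT_EVENT_TYPE_MOTION = 2,
};

typedef struct {          /* mock stand-in for AInputEvent */
    int   type;           /* AInputEvent_getType() */
    int   key_code;       /* AKeyEvent_getKeyCode() */
    float x, y;           /* AMotionEvent_getX()/getY() */
} MockInputEvent;

/* Returns 1 if the event was consumed, mirroring what a NativeActivity's
 * input callback reports back to the system; 0 lets the system handle it. */
int handle_input(const MockInputEvent *ev, int *last_key,
                 float *last_x, float *last_y)
{
    switch (ev->type) {
    case AINPUT_EVENT_TYPE_KEY:
        *last_key = ev->key_code;
        return 1;
    case AINPUT_EVENT_TYPE_MOTION:
        *last_x = ev->x;
        *last_y = ev->y;
        return 1;
    default:
        return 0;
    }
}
```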

Finally, your third point is just wrong.  There is not one Dalvik VM running everything, like you seem to think -- each app gets its own processes, running its own VM.  So if an app is doing something that causes it to need to do a GC, this won't block your own process's VM.  (And even if your own process does a GC, you can easily run native threads in that process that aren't attached to the VM and so won't get blocked by anything the VM does.)  These kinds of interactions between processes are governed almost entirely by the kernel's scheduler, and Android does a fair amount of stuff (such as its use of background vs. foreground cgroups) to get the scheduler to avoid letting work done by background apps impact the running of foreground apps.  In fact, one of the things that has been done for the improved audio latency is introducing a new scheduling group for low latency audio threads that further helps the kernel schedule them w.r.t. other threads in the system.
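The native-thread point can be sketched with plain POSIX threads: a pthread that never calls `AttachCurrentThread` is invisible to the Dalvik VM, so a GC in the same process does not suspend it. The worker below is a stand-in for an audio or render loop:

```c
/* A plain pthread that never attaches to the VM (never calls
 * JavaVM AttachCurrentThread) is not paused by Dalvik's garbage
 * collector; only the kernel scheduler can preempt it.  The loop body
 * is a stand-in for real audio mixing or rendering work. */
#include <pthread.h>

typedef struct {
    long iterations;   /* how many loop passes to run */
    long done;         /* filled in by the worker */
} WorkerArgs;

static void *worker(void *arg)
{
    WorkerArgs *wa = arg;
    long n = 0;
    for (long i = 0; i < wa->iterations; i++)
        n++;           /* stand-in for per-buffer/per-frame work */
    wa->done = n;
    return NULL;
}

/* Spawn the worker (e.g. from a JNI "start" call) and wait for it so
 * the result can be read back. Returns -1 if the thread can't start. */
long run_native_worker(long iterations)
{
    WorkerArgs wa = { iterations, 0 };
    pthread_t t;
    if (pthread_create(&t, NULL, worker, &wa) != 0)
        return -1;
    pthread_join(t, NULL);
    return wa.done;
}
```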

--
Dianne Hackborn
Android framework engineer
[hidden email]

Note: please don't send private questions to me, as I don't have time to provide private support, and so won't reply to such e-mails.  All such questions should be posted on public forums, where I and others can see and answer them.
