I have an io field where I want to accept a number from 0 - 99,999. So I use the Ultra Numeric Editor and set the Min Value to 0, the Max Value to 99999. What's irritating me is when the field is displayed on the screen it has a Zero in it (which is correct) and when I tab into the field, the cursor is set in front of the 0. Consequently, if I enter a 1, the number becomes 10... I don't want that, I want it to change the value from 0 to 1. What can I set to have the Value automatically selected when I enter the field so that whatever I type OVERWRITES the value that was in there?
Why does the UltraMaskedEdit select the value when you tab into the field but the UltraNumericEditor doesn't? I would use the UltraMaskedEdit control as my Numeric IO field if I could set it to right justify the value and enter the value from the right side like a Numeric editor would, but unfortunately, I can't seem to make it do that... More irritation...
My second complaint on the Numeric Editor is that if I delete the 0 then try to exit the field, I get a beep and the cursor won't leave the field until I put in a 0 or some other value. (I realize I DO have the Nullable property set to False, which is what I want) How can I set a Default (like 0) value for the field?
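For reference, the generic WinForms pattern for both problems is to select the whole value when the field gains focus and to restore a default in Validating. A minimal sketch using a plain TextBox is below; the Infragistics editor may expose equivalent members, but check its documentation rather than assuming these exact names:

```csharp
// Generic WinForms sketch (plain TextBox used for illustration only;
// the UltraNumericEditor's own API may differ).
using System;
using System.Windows.Forms;

public class NumericFieldDemo : Form
{
    private readonly TextBox amountBox = new TextBox();

    public NumericFieldDemo()
    {
        Controls.Add(amountBox);
        amountBox.Text = "0";

        // Select the entire value when the field gains focus so that the
        // first keystroke overwrites it. BeginInvoke defers the call until
        // after the control's own focus handling, which can otherwise
        // reset the selection.
        amountBox.Enter += (s, e) =>
            BeginInvoke(new Action(amountBox.SelectAll));

        // If the user empties the field, quietly restore a default of 0
        // instead of blocking focus from leaving the field.
        amountBox.Validating += (s, e) =>
        {
            if (amountBox.Text.Trim().Length == 0)
                amountBox.Text = "0";
        };
    }
}
```

The same two hooks (focus-enter and validation) are where most third-party editors let you customize this behavior.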
Finally, I STILL don't get emailed replies to my posts, even though I ask for them.
To what dropdown issue are you referring?
I answer about 20+ posts a day, so I don't remember.
Yehaw, we agree on one outta five... :)
On the one we agree on, can we add a case to fix the broken behavior of the dropdown control I documented a while back? Since it could theoretically be a "breaking" change (even you agreed it wasn't very likely to cause a problem), you weren't willing to add it back when we were discussing it. But if we could add it as a Breaking Change Fix for 2009 vol 2, that would be nice.
Wolven said: A: It WOULD work if your controls were "smart" enough to not fire a Click event twice along with the DoubleClick event. (yeah, yeah, yeah, I know... that's how the infinitely wise MS does it)
This is not something that Infragistics invented. It's not even specific to DotNet - it's simply the way Windows works. When you click on something, the machine does not have any way of knowing that you intend to click a second time. It cannot see the future. So each click sends a click message.
I suppose Windows could simply buffer every click and not process it until the double-click time expired before deciding if a click is meant to be a single click or a double-click. But it seems to me this would make for a sluggish and annoying user experience.
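To make the trade-off concrete, here is a rough sketch of what such click buffering would look like. This is a hypothetical helper, not anything the toolbars manager actually does; note that every single click is delayed by the full double-click interval before it is reported, which is exactly the sluggishness described above:

```csharp
// Hypothetical sketch: buffer clicks to distinguish single from double.
// The cost is that every single click is delayed by the full
// double-click time before it fires.
using System;
using System.Windows.Forms;

public class BufferedClickHelper
{
    private readonly Timer timer = new Timer();
    public event EventHandler SingleClick;
    public event EventHandler DoubleClick;

    public BufferedClickHelper()
    {
        // SystemInformation.DoubleClickTime is the user-configurable
        // double-click interval, in milliseconds.
        timer.Interval = SystemInformation.DoubleClickTime;
        timer.Tick += (s, e) =>
        {
            timer.Stop();
            // No second click arrived in time: report a (late) single click.
            SingleClick?.Invoke(this, EventArgs.Empty);
        };
    }

    // Call this from the control's raw click handler.
    public void OnClick()
    {
        if (timer.Enabled)
        {
            // Second click within the interval: it's a double click.
            timer.Stop();
            DoubleClick?.Invoke(this, EventArgs.Empty);
        }
        else
        {
            timer.Start();
        }
    }
}
```

With the default 500 ms double-click time, every button press would feel half a second late, which is why no mainstream toolkit buffers clicks this way.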
Wolven said: B: The unlikely "breaking" scenario you gave would be due to some awfully sloppy programming on somebody's part. I'd suggest that that type of programming deserves to get broken.
I don't think it's unreasonable for a developer to assume that, since an event only fires for certain tools, it will continue to do so in the future.
Wolven said: C: I can't tell you how nice it is to have my tool suppliers KNOW what's best for me and my applications. I wonder if that's how it works in every other industry... Do the tool makers tell Ford that there's no compelling case to build a vehicle the way Ford wants to do it? Do the subcomponent suppliers tell Boeing that there's no good reason they can see to build the part according to Boeing's specs?
I think we here at Infragistics are very open to customer requests and new ideas. But there are certain limits and standards that the operating system and the environment enforce upon us as well as our customers, and there are good reasons for that. It provides consistency across applications and development tools.
Wolven said: D: While I can appreciate and understand that this particular "feature request" may not rank up at the top of your importance list ... I don't appreciate being told that just because you don't see a need for it, there's no reason to do it. Just because that's not the way it's always been done, isn't a good reason for not trying something new. Innovation usually involves doing things a bit differently.
The fact that "that's not the way it's always been done" is irrelevant and I don't believe I made that suggestion.
My point here is that even if we did what you asked for and started firing the DoubleClick event for button tools, it wouldn't help you achieve what you want here. So what would be the point?
Are you saying that we should add this feature simply because it was requested by a single customer - even though there is no practical use for it and it will take time away from the development of other, more useful, features?
Or are you suggesting that, in spite of the fact that Windows fires 2 click events along with a DoubleClick, the toolbars manager should not do so? In order for that to happen, the toolbar would either have to somehow know that a second click is coming, or else delay the processing of the click event until the DoubleClick time (which is a system setting and can be changed) expired. Which means clicking on buttons would be slow and unresponsive.
Wolven said: E: Finally, I've thought about this "we can't change the broken behaviour because it might break someone's application" problem I keep running into. It seems perfectly reasonable that when you come out with a new version (like the upcoming NetAdvantage 2009 vol 2), you could implement "breaking" style FIXES and changes as long as they were well documented in a "Breaking Changes" document. This would eliminate the never-ending reason for not fixing broken code.
I agree. And we occasionally do this when it's necessary.
Mike Saltzman said:
Actually, it does. Suppose you have a very simple application with a few buttons and a TextBoxTool. You could handle the ToolDoubleClick and just respond to it under the assumption that only the TextBoxTool could possibly have triggered it. Now the DoubleClick event starts firing for button tools and your application is broken.
I admit this seems unlikely, but we have to be very careful about making assumptions like this. If it breaks even one customer's application and doesn't add any value to the control, then it's really not worth it.
If there was some useful purpose to it, I would agree with you. But so far, I haven't heard any compelling case where this would be at all useful. So it can potentially break existing applications, and it adds nothing. It won't even work for the specific case you were trying to use it for.
A: It WOULD work if your controls were "smart" enough to not fire a Click event twice along with the DoubleClick event. (yeah, yeah, yeah, I know... that's how the infinitely wise MS does it)
B: The unlikely "breaking" scenario you gave would be due to some awfully sloppy programming on somebody's part. I'd suggest that that type of programming deserves to get broken.
C: I can't tell you how nice it is to have my tool suppliers KNOW what's best for me and my applications. I wonder if that's how it works in every other industry... Do the tool makers tell Ford that there's no compelling case to build a vehicle the way Ford wants to do it? Do the subcomponent suppliers tell Boeing that there's no good reason they can see to build the part according to Boeing's specs?
D: While I can appreciate and understand that this particular "feature request" may not rank up at the top of your importance list ... I don't appreciate being told that just because you don't see a need for it, there's no reason to do it. Just because that's not the way it's always been done, isn't a good reason for not trying something new. Innovation usually involves doing things a bit differently.
E: Finally, I've thought about this "we can't change the broken behaviour because it might break someone's application" problem I keep running into. It seems perfectly reasonable that when you come out with a new version (like the upcoming NetAdvantage 2009 vol 2), you could implement "breaking" style FIXES and changes as long as they were well documented in a "Breaking Changes" document. This would eliminate the never-ending reason for not fixing broken code.
If a particular business doesn't want to deal with updating their software to handle the new PROPERLY functioning and enhanced controls, then they simply don't have to upgrade to the new version. They can stay on their current version for as long as they like. If I'm not mistaken, prior versions DO still get fix updates for some period of time after a new version comes out.
While backwards compatibility definitely has some benefits, when it starts impeding progress it's time to let it go.
I checked to see what the .NET Button control does, since that is the model on which a lot of our button behavior is based, and it does not fire the DoubleClick event either, even though it inherits said event from the base Control class. They also hide the DoubleClick event from the designer, presumably to discourage its use since it is not applicable.