How can I overlay two Xamarin Forms layouts so that both can receive touch input?

(I posted this first on the Xamarin forums, but then decided I might get a faster answer here.) TL;DR: Some layouts register taps on their transparent backgrounds, others don't. Setting InputTransparent on a container sets it for all of its children, and I feel that children should be able to override the parent. I need to create overlays that sit on top of another element and pass taps through their transparent areas while still containing clickable buttons. When I try to do this with a Grid, it doesn't work. I'd rather not go back to AbsoluteLayouts. I'm mostly working in iOS right now, so I'm not entirely sure whether this is also a problem on Android. Windows Phone/UWP is not on the table.

Longer version:

I am rewriting some layouts that worked well under an older version of Xamarin Forms (1.3, I think). We recently upgraded to 2.1, and that wreaked havoc on some poorly written layout code. I have been tasked with making the layouts behave again. I'm aware that 2.2 has been released; I just tried updating, and everything I describe seems to hold in that version too, so this isn't a 2.1-versus-2.2 problem, or at least if there were fixes they don't help me.

This is a mapping application, so the centerpiece of every layout is an expensive, temperamental OpenGL element. That element does not react well to being rearranged, so I used a layout similar to this imaginary XAML:

<ContentPage>
    <CustomLayout>
        <OurHeaderControl />
        <TheMapControl />
        <OurFooterControl />
        <MapOverlay />
    </CustomLayout>
</ContentPage>

The purpose of MapOverlay is to implement our workflows by adding Xamarin elements on top of the header/footer and/or map areas. For example, one layout adds a list of directions near the footer, so there is less space left to display the map. The custom layout understands this and lays out the map after the overlay, so it can ask the overlay for the correct map bounds.
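To give a rough idea of what I mean, the custom layout does something along these lines. This is only a sketch, not our real code; Layout<View> and LayoutChildIntoBoundingRegion are real Xamarin Forms APIs, but MapBounds is a made-up property standing in for however the overlay reports the space it leaves for the map.

using Xamarin.Forms;

// Illustrative sketch only: lay out the overlay first, then put the map
// wherever the overlay says it should go. "MapBounds" is a placeholder name.
public class CustomLayoutSketch : Layout<View>
{
    public View Map { get; set; }
    public MapOverlay Overlay { get; set; }

    protected override void LayoutChildren(double x, double y, double width, double height)
    {
        // The overlay covers the whole page and decides how much room the map gets.
        LayoutChildIntoBoundingRegion(Overlay, new Rectangle(x, y, width, height));

        // The map is then laid out into whatever region the overlay left for it.
        LayoutChildIntoBoundingRegion(Map, Overlay.MapBounds);
    }
}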

In this layout, I can't tap anything underneath the MapOverlay. I can set InputTransparent on it and then tap those things, but then all of its children become input transparent as well. That wasn't the case in the old layouts.

Here are the only differences that I see between the old layouts and the new:

The old layouts were a complicated mess of AbsoluteLayouts. It looked something like this (I didn't write it):

<ContentPage>
    <AbsoluteLayout>              <!-- "main layout" -->
        <AbsoluteLayout>          <!-- "map layout" -->
            <Map />               <!-- An AbsoluteLayout containing the OpenGL view. -->
        </AbsoluteLayout>
        <AbsoluteLayout>          <!-- "child layout" -->
            <SubPage />           <!-- An AbsoluteLayout -->
        </AbsoluteLayout>
    </AbsoluteLayout>
</ContentPage>

The main layout contains AbsoluteLayouts that constrain the child views. One child view is itself an AbsoluteLayout containing the map and several related elements. The other child is the overlay, which is also always an AbsoluteLayout containing the elements for that overlay. These layouts all bind to each other's layout events and update each other as they change. It's an exciting game of ping pong that eventually settles. Usually. Sometimes things just disappear. Obviously, there is a reason I am rewriting it.
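Roughly, the coupling I mean looked like the sketch below (the names and numbers are invented for illustration, this is not the actual code): each layer watches the other's size and repositions it in response, which triggers the other handler again.

// Invented example of the mutual layout-event coupling between the layers.
mapLayout.SizeChanged += (s, e) =>
{
    // When the map layer changes size, move and resize the overlay layer...
    AbsoluteLayout.SetLayoutBounds(childLayout,
        new Rectangle(0, mapLayout.Height * 0.75, mapLayout.Width, mapLayout.Height * 0.25));
};

childLayout.SizeChanged += (s, e) =>
{
    // ...and when the overlay layer changes, shrink the map layer, which fires
    // the handler above again -- the "ping pong" described above.
    AbsoluteLayout.SetLayoutBounds(mapLayout,
        new Rectangle(0, 0, mainLayout.Width, mainLayout.Height - childLayout.Height));
};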

But with that layout I can tap what I need on every layer, and I don't understand why.

So let me boil this down to what I need to work, and maybe find out whether it's a bug, why it doesn't work, or how to accomplish it with other layouts. Here's a non-XAML page layout; my project has its roots in the days when you couldn't use XAML in shared libraries:

I need to be able to tap both buttons in this UI and have them respond.

public class MyPage : ContentPage
{
    public MyPage()
    {
        var mainLayout = new AbsoluteLayout();

        // The first of two buttons; this one will be overlaid.
        var overlaidButton = new Button()
        {
            Text = "Overlaid",
            Command = new Command((o) => DisplayAlert("Upper!", "Overlaid button.", "Ah."))
        };

        mainLayout.Children.Add(
            overlaidButton,
            new Rectangle(0.25, 0.25, AbsoluteLayout.AutoSize, AbsoluteLayout.AutoSize),
            AbsoluteLayoutFlags.PositionProportional);

        // The overlaying layout on top will be a Grid.
        var overlay = new Grid()
        {
            RowDefinitions =
            {
                new RowDefinition() { Height = new GridLength(1.0, GridUnitType.Star) }
            },
            ColumnDefinitions =
            {
                new ColumnDefinition() { Width = new GridLength(1.0, GridUnitType.Star) },
                new ColumnDefinition() { Width = new GridLength(1.0, GridUnitType.Star) },
            },
            BackgroundColor = Color.Transparent
        };

        var overlayingButton = new Button()
        {
            Text = "Overlaying",
            Command = new Command((o) => DisplayAlert("Upper!", "Overlaying button.", "Ah."))
        };

        overlay.Children.Add(overlayingButton, 0, 1);

        mainLayout.Children.Add(overlay, new Rectangle(0, 0, 1.0, 1.0), AbsoluteLayoutFlags.All);

        // This pair of property sets makes the overlaid button clickable, but not the overlaying one!
        // overlay.InputTransparent = true;
        // overlayingButton.InputTransparent = false;

        Content = mainLayout;
    }
}

That pair of property sets lets me tap the overlaid button, even when the Grid is changed to an AbsoluteLayout, but then the overlaying button no longer responds.

I'm at a dead end. It took me two weeks to debug the original layouts and come up with the new approach. I really don't want to tear apart all of our layouts and shove everything into one big AbsoluteLayout or a custom layout of my own. WPF had two kinds of transparency: a Transparent background still participated in hit testing, while a null background did not. Is there a way to overlay layouts in Xamarin like that?

Or, more to the point, why does the intricate nest of AbsoluteLayouts in our old layouts work the way I need, while this much simpler layout does not?
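For reference, here is a minimal sketch of the WPF distinction I mean (WPF code, not Xamarin Forms; both panels look identical on screen but hit-test differently):

using System.Windows.Controls;
using System.Windows.Media;

// A Transparent brush still participates in hit testing, so this panel
// swallows clicks over its whole area.
var hitTestable = new Grid { Background = Brushes.Transparent };

// A null background is not hit-test visible, so clicks in this panel's
// empty area fall through to whatever is behind it.
var clickThrough = new Grid { Background = null };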

Updates

Here is some additional information that I remember:

1 answer

In general, it looks like the behavior you are seeing comes down to how iOS handles InputTransparent on a Grid compared to the other two platforms. I'm not entirely sure I would qualify the current behavior as a bug at this point, but I understand how frustrating it is to run into uneven behavior across platforms.

However, there does appear to be a workaround for your situation, if I understand it correctly. It looks like a similar report has been filed before, and the iOS behavior was addressed via this SO link. That question was asked in the context of a non-Forms iOS application, but the same logic can be applied here.

Using a custom renderer (a CustomGrid in this example), you can override hit testing in the iOS Grid implementation along the lines of the linked answer, so that touches pass through to the underlying views:

CustomGrid.cs (PCL):

public class CustomGrid : Grid
{
    public CustomGrid()
    {
    }
}

CustomGridRenderer.cs (iOS):

using CoreGraphics;
using UIKit;
using Xamarin.Forms;
using Xamarin.Forms.Platform.iOS;

[assembly: ExportRenderer(typeof(CustomGrid), typeof(CustomGridRenderer))]

public class CustomGridRenderer : ViewRenderer
{
    public override UIView HitTest(CGPoint point, UIEvent uievent)
    {
        UIView hitView = base.HitTest(point, uievent);

        // If the touch landed on the renderer itself (the Grid's transparent
        // background), return null so it falls through to the views underneath.
        if (hitView == this)
        {
            return null;
        }

        return hitView;
    }
}

This way you do not have to explicitly set InputTransparent on iOS, and any taps on the Grid itself are passed to whatever is below it. Since Android does honor InputTransparent in this particular case, however, you can wrap that setting in Device.OnPlatform and skip implementing an Android custom renderer if you'd like:

Device.OnPlatform(Android: () => { overlay.InputTransparent = true; });

Using your earlier code, modified to use the CustomGrid and the iOS renderer, I can tap both buttons.
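For clarity, here is roughly what that modification looks like in your repro: only the overlay's construction changes, from new Grid() to new CustomGrid(), plus the platform-specific InputTransparent line (a sketch of the wiring, assuming the CustomGrid and renderer above):

// The overlay from the question, built as a CustomGrid so the iOS renderer's
// HitTest override takes effect.
var overlay = new CustomGrid()
{
    RowDefinitions =
    {
        new RowDefinition() { Height = new GridLength(1.0, GridUnitType.Star) }
    },
    ColumnDefinitions =
    {
        new ColumnDefinition() { Width = new GridLength(1.0, GridUnitType.Star) },
        new ColumnDefinition() { Width = new GridLength(1.0, GridUnitType.Star) },
    },
    BackgroundColor = Color.Transparent
};

// On Android, InputTransparent on the container already behaves as desired,
// so only set it there and skip an Android renderer.
Device.OnPlatform(Android: () => { overlay.InputTransparent = true; });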
