Indoor localization methods provide pose information in their own virtual coordinate systems. Aligning these virtual spaces with the real physical space can be a complex and costly procedure in terms of manpower and equipment. Verifying whether the adjusted poses accurately reflect the poses in the real space is also difficult. This paper proposes applying the WYSIWYG (What-You-See-Is-What-You-Get) paradigm to indoor localization systems as a tool for physical Human-Robot Interaction (pHRI) experiments. To realize this, we have constructed a system combining floor projection with a self-developed virtual-to-real coordinate system adjustment tool that offers an easy-to-use, intuitive user interface. The calculated real positions are displayed in real time in the same space as the tracked objects, so an operator can instantly verify and adjust the virtual-to-real coordinate transformation parameters, minimizing the cost of calibration and potentially increasing accuracy. The proposed system gathers positions from indoor localization systems and provides the transformed real-space pose information through several methods, including a common standard interface in the form of Robot Operating System (ROS) messages. For the proof-of-concept system, we used the trackers of the HTC Vive system as the localization data source and a consumer-grade projector, resulting in a low-cost solution. Our preliminary experiments show that the proposed WYSIWYG system provides a suitable environment for pHRI scenarios and can also support secondary functions such as intention projection and augmented reality applications. Furthermore, as a tool it can facilitate the conversion of simulated HRI scenarios into real physical experiments.
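The virtual-to-real adjustment described above can be understood as estimating a similarity transform (scale, rotation, translation) between the tracker's virtual frame and the physical floor frame from a handful of corresponding points. The sketch below illustrates this idea in Python with NumPy, using an Umeyama/Kabsch-style least-squares fit; the function names and the point-pair calibration workflow are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fit_similarity_transform(virtual_pts, real_pts):
    """Least-squares similarity transform mapping virtual tracker
    coordinates onto measured real-floor coordinates (Umeyama-style).
    Both inputs are (N, 3) arrays of corresponding points, N >= 3."""
    v = np.asarray(virtual_pts, dtype=float)
    r = np.asarray(real_pts, dtype=float)
    mu_v, mu_r = v.mean(axis=0), r.mean(axis=0)
    vc, rc = v - mu_v, r - mu_r
    # SVD of the cross-covariance gives the optimal rotation (Kabsch).
    U, S, Vt = np.linalg.svd(rc.T @ vc)
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0, 1.0, d])            # guard against reflections
    R = U @ D @ Vt
    scale = (S * np.diag(D)).sum() / (vc ** 2).sum()
    t = mu_r - scale * (R @ mu_v)
    return scale, R, t

def virtual_to_real(p_virtual, scale, R, t):
    """Map one virtual-space position into the real floor frame."""
    return scale * (R @ np.asarray(p_virtual, dtype=float)) + t
```

In a workflow like the one the abstract describes, the operator would mark a few known floor positions, record the tracker's virtual coordinates at each, fit the transform once, and then apply `virtual_to_real` to live poses before projecting them onto the floor, where any residual misalignment is immediately visible and correctable.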