
1 Introduction

Interior design has been practiced since the days of the Ancient Egyptians and is defined as the art and science of enhancing the interior of a space to achieve a more polished and aesthetically pleasing environment. Engaging the services of an interior designer can be costly, so individuals who are not able to engage an interior designer tend to design their room or space by visiting furniture showrooms and social media sites for design ideas. However, a lack of design experience may result in issues such as not being able to visualize what the space will look like with a particular set of furniture or decorations from a catalog or an image (Tan, 2013), or even with a new color for the wall or floor (Phan & Choo, 2010; Tan, 2013). Additionally, inexperienced individuals who lack the skills to estimate the dimensions of the space risk purchasing unsuitable furniture that does not fit into the designated space (Sandu & Scarlat, 2018).

Recent advancements in technology have made it possible for individuals to design their room or space using Augmented Reality (AR). AR enables virtual objects to be visualized in real-life environments, so individuals can see how a specific piece of furniture would look in their room or space using their smartphones. This greatly reduces the possibility of purchasing wrong or unsuitable furniture. In addition, AR offers the potential to redesign an entire room, from the color of the walls to the desired decorations. Evidence from the literature suggests that the AR-enabled interior design applications currently on the market only allow users to view furniture on the horizontal plane and do not allow users to place wall furniture such as picture frames, wall shelves, and wall lights on the vertical plane, which limits the users’ ability to fully design a room or space. Furthermore, these applications are not shareable, which prevents users from designing collaboratively (Sandu & Scarlat, 2018).

The aim of this research is to develop a collaborative AR-enabled application for interior design that overcomes the limitations of existing interior design applications. To achieve this aim, the following objectives are proposed:

  1. Examine the limitations of existing interior design applications

  2. Design a collaborative AR-enabled interior design environment

  3. Develop an enhanced AR-enabled mobile application that overcomes the limitations of the existing interior design applications

In Sect. 2, a literature review of previous studies proposing AR-enabled interior design applications is presented, and existing AR-enabled interior design applications are compared to examine the limitations of each application. In Sect. 3, the methodology used for the development of the proposed application, the Rapid Application Development model, is discussed. In Sect. 4, the results are outlined with emphasis on the functionality of the proposed application, such as the collaborative session, changing the wall and floor color, and placing furniture on both the vertical and horizontal planes; the technical implementation of these functionalities is also discussed. In Sect. 5, the functionality of the proposed AR-enabled application (ARID) is compared with that of existing applications. In Sect. 6, we present our conclusion by discussing the weaknesses of the proposed application and future improvements.

2 Literature Review

Augmented Reality (AR) is a technology or environment that blends the real environment with a virtual environment. The user is able to see and interact with virtual objects presented in the real world (Nasir et al., 2018). As proposed by Milgram and Kishino (1994), Augmented Reality (AR) and Augmented Virtuality (AV) are the two core elements under the Mixed Reality (MR) range of the Reality–Virtuality (RV) continuum, as illustrated in Fig. 1. The difference between Augmented Reality (AR) and Virtual Reality (VR) is that rather than placing the user in a completely virtual world, as VR does, AR displays additional information on top of the user’s familiar environment (Wang, 2009). Following the advancement of AR technology, AR has been implemented in many fields to resolve everyday issues, benefiting the user by saving both time and cost (Nasir et al., 2018). The growth in smartphone use has also been beneficial, as more people are able to experiment with AR. Recently, AR has been implemented in fields such as health care, education, and interior design, the latter being the focus of this research.

Fig. 1 Milgram and Kishino’s Reality–Virtuality continuum (Milgram & Kishino, 1994)

2.1 Previous Studies of Proposed AR-Enabled Interior Design Applications

This section examines previous studies proposing AR-enabled interior design applications to understand the objectives, technologies, and approaches of each study, as well as the limitations of each proposed application. AR has gained popularity and interest in the interior design and architectural fields, and existing research has made suggestions for the implementation of AR-enabled applications for interior design. Table 1 provides a comparison of selected existing literature on AR-enabled applications.

Table 1 Comparison of previous literature

Tang et al. (2014) proposed an application that uses AR to automatically arrange furniture in a tight space based on spatial and functional relationships. The application helps people living in high-density areas determine whether a piece of furniture fits into their home. It was developed using a Microsoft KINECT depth sensor to generate a 3D point cloud containing information on the depth of the objects. Depth sensors are a sensing technology normally used to measure distances, and a 3D point cloud is a set of data points in space that captures the depth information of objects. Information such as supporting surfaces and object sizes is then estimated from the depth map, and this information is processed to allow the computer to generate a virtual environment of the targeted space. The limitation of the Microsoft KINECT technology, as concluded by Tang et al., was that the depth map captured by the KINECT sensor is highly affected by the lighting of the room, which results in poor plane estimation.

In another study, Gurcinar and Esen (2018) proposed an application of Augmented Reality in interior design education. The AR-enabled application was created using Unity to track a printed 2D interior plan on a table and display two different virtual 3D models on top of it through the students’ devices. Students who tested the AR mobile application agreed that it improved their perception of the given space, thereby improving motivation and creativity. The limitation of this proposed application was that it only allowed the students to view the virtual 3D models and lacked interactivity (Gurcinar & Esen, 2018).

On the other hand, Sandu and Scarlat (2018) noted that the interior design applications currently on the market, i.e., IKEA Place, Houzz, and Homestyler, only allow the user to visualize certain pieces of furniture and are not able to completely redesign a room. With the recent introduction of depth sensors in mobile phones, there is potential for the user to easily scan a room using their mobile device. In the method proposed by Sandu and Scarlat, the user scans the entire room by walking to certain points indicated by the application, which enables the application to determine the size and shape of the given room (Sandu & Scarlat, 2018). In contrast, with the introduction of the LIDAR scanner in mobile phones, a room can be scanned simply by pointing the camera in the direction of the wall, window, or door.

2.2 Case Study on Existing AR-Enabled Interior Design Applications

In this section, the existing AR-enabled interior design applications currently available for download on the Apple App Store are compared and each application’s limitations are examined. Table 2 summarizes the functionalities implemented in the existing AR-enabled interior design applications.

Table 2 Comparison of existing AR-enabled interior design applications

The IKEA Place application was introduced by IKEA, a renowned Swedish ready-to-assemble furniture company. The application is capable of displaying 3D renderings of more than 2,000 products from IKEA’s catalog. The displayed product can be viewed from different angles, with realistic textures, fabric, lighting, and shadows (Ozturkcan, 2020). IKEA Place is one of the first applications developed with ARKit, which fully utilizes the iPhone’s motion sensors and camera to give the user an accurate representation of the products. The application received favorable reviews and still maintains its popularity, being referred to as “the pioneer AR experience in retail” (Ozturkcan, 2020).

Houzz is a home décor company providing services such as home renovation and connecting homeowners to home professionals. Houzz introduced AR into its “View in My Room” feature, which presently allows a preview of the selected furniture in 3D (Goode, 2017). The AR “View in My Room” function supports placing more than one piece of 3D furniture and also supports occlusion for a more realistic experience, although the occlusion is neither effective nor responsive. The application also supports AR previews of decorations, such as paintings, that can be placed on vertical surfaces like a wall.

Homestyler is a 3D interior design application with an additional AR function, giving the user the option of designing the room in 3D or in AR. The AR function works fairly similarly to the IKEA Place app, allowing users to select furniture from a catalog and preview it using AR. The AR mode in the Homestyler app supports neither multiple-surface tracking nor occlusion. However, it supports the basic functionalities of an AR application, such as placing multiple objects and rotating and repositioning them.

The review of the three existing AR applications for interior design clearly shows that all three applications are only able to display furniture. For example, IKEA Place and Houzz only allow the user to insert furniture models into the AR environment; additional features, such as generating a floor plan to see the placement of the furniture, are still not available (Sandu & Scarlat, 2018). Additionally, these applications only support single-user use and lack shareability: users are not able to share what they are seeing with other users.

3 Methodology

The methodology used in the development of this project is the Rapid Application Development (RAD) model (Martin, 1992). The RAD model is a software development methodology that does not require large amounts of pre-planning; instead, it requires the developer to produce a prototype of the application as soon as possible. Typically, RAD is applied to projects with a tight timeline and uses prototyping together with high-level development tools and techniques (Coleman & Verbruggen, 1998). These characteristics suited this project, which had a short timeline of three months, and the flexibility of RAD allowed for more efficient implementation of the application during development (Geambasu et al., 2011). The RAD model can be split into four phases (Fig. 2): the requirements planning phase, the user design phase, the construction phase, and lastly the cutover phase.

Fig. 2 Rapid Application Development Methodology (Martin, 1992)

In the requirements planning phase, a survey questionnaire was carried out to gather the requirements for the proposed application. The target audience for this questionnaire was individuals between 20 and 40 years of age who were interested in furnishing or designing their room or space. A total of 84 responses were gathered and analyzed, resulting in the following list of proposed functionalities: multiple surface scanning, placing objects on walls, placing multiple AR objects, object interaction (rotating and repositioning AR objects), collaborative sessions (placing shareable AR objects), changing the wall color, changing the floor color, and occlusion. Design documents were then produced from the proposed functionalities. Figure 3 presents a high-level use-case diagram for the proposed application.

Fig. 3 High-level use-case diagram for the proposed application

In the next phase, the user design phase, the ARID application prototype was developed using Unity version 2020.2.0f1. To develop AR applications in Unity, AR Foundation, ARKit, and the XR Interaction Toolkit were installed. Photon Engine, a multiplayer game server generally used for online multiplayer games, was used to implement the server for the collaborative session. Finally, Xcode was used to compile the code and build the application after it was developed in Unity. Table 3 summarizes the software used during the implementation of the proposed application.

Table 3 Software used during implementation

The furniture and decoration 3D objects in the application were taken from the asset pack ArchVizPRO Interior Vol.5 created by ArchVizPRO (2020). The asset pack includes more than 150 high-quality interior 3D models with photorealistic materials and textures; it was downloaded from the Unity Asset Store and imported into Unity. Because each furniture 3D object is large, all of the furniture and decoration 3D objects are hosted remotely to reduce the application size and improve load time. To achieve this, a scriptable object was first created containing the details of each piece of furniture and decoration: its name, price, thumbnail, description, materials, and 3D model. All of the available furniture and decoration 3D objects were stored as scriptable objects and set as Addressable. By setting them as Addressables, the Unity Addressable Asset System can be used to load each asset by its “address”, allowing the system to locate the addressable asset on the remote content delivery network and enabling better asset management. The Addressables are then stored in an asset group, and in the Addressable settings the group is configured to be hosted remotely, telling the system to retrieve these prefabs from the link provided. The asset group is then built and the files are uploaded to the remote content delivery network of choice, a Google Cloud Storage bucket.
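
As an illustration of this setup, the following sketch in Unity C# shows what such a scriptable object and its remote loading could look like. The class and member names (FurnitureItem, modelReference, FurnitureLoader) are hypothetical and are not taken from the ARID source code.

using UnityEngine;
using UnityEngine.AddressableAssets;

// Hypothetical catalog entry holding the details of one furniture or decoration item.
[CreateAssetMenu(fileName = "FurnitureItem", menuName = "ARID/Furniture Item")]
public class FurnitureItem : ScriptableObject
{
    public string itemName;
    public float price;
    public Sprite thumbnail;
    [TextArea] public string description;
    public Material[] materials;

    // Addressable reference to the 3D model prefab hosted on the remote content delivery network.
    public AssetReferenceGameObject modelReference;
}

// Instantiates the prefab by its address when the item is selected from the menu.
public class FurnitureLoader : MonoBehaviour
{
    public void Spawn(FurnitureItem item, Vector3 position, Quaternion rotation)
    {
        // Addressables resolves the address against the remote catalog configured
        // in the Addressable settings (here, a Google Cloud Storage bucket).
        item.modelReference.InstantiateAsync(position, rotation);
    }
}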

Each prototype was developed incrementally, with functionality added at every iteration; the ARID prototype was developed in three iterations. In the first iteration, the placement of objects on walls, the placement of multiple AR objects, and object interaction were developed. The second iteration added the change floor color and change wall color functionality, and the last iteration added the collaborative session functionality. After each iteration, a series of unit tests was conducted by exercising each function on a smartphone with the prototype installed to ensure proper functionality. Once the prototype met all the user requirements and passed all unit tests, it moved into the construction phase, where it was converted into the final model by implementing the UI and the in-session AR instructions for the user. The final model was then tested again using unit testing; integration testing, in which the different modules were combined to test the interfaces between them and ensure data flows correctly through the system; and user acceptance testing, in which an interview was carried out. Five end users with different levels of experience with AR applications were selected for the interview in order to gather a more diverse perspective on the application. The focus group sessions were carried out through online video conferences due to the pandemic.

Lastly, the cutover phase is where the production-ready application is distributed. The proposed ARID application is currently being finalized for distribution.

4 Results

This section explains navigation in the application and reflects on how the proposed functionalities of the AR-enabled interior design (ARID) application were implemented. Figure 4 illustrates the launch screen of the proposed application, the start a session screen, which allows users to choose between starting a solo session and a collaborative session.

Fig. 4 Select session screen

4.1 Place AR Object and AR Object Interaction

Figure 5 depicts the flow of the place AR object function once an AR session is selected. This session allows users to select, place, rotate, reposition, and remove AR objects. This enables users to design a room freely to better envision the room in its complete form. Users are also able to determine whether a piece of furniture or decoration will fit in the space, thereby minimizing the risk of purchasing unsuitable furniture.

Fig. 5 Place AR object function algorithm flowchart

Figure 6 depicts the initial screen that is shown when the user starts an AR session. Figure 7 depicts the operation of placing an AR object in the AR session. Figure 8 depicts the operation of selecting and manipulating an AR object; the manipulation shown consists of rotating the AR object and changing its position. Figure 9 depicts placing multiple AR objects in an AR session.

Fig. 6 AR session screen

Fig. 7 Place AR object

Fig. 8 Rotate AR object and change AR object position

Fig. 9 Placing multiple AR objects

The Unity XR Interaction Toolkit package was used to implement the place AR object and AR object interaction functions. AR object interaction enables rotating and repositioning the AR object. The built-in scripts in the XR Interaction Toolkit were used to enable manipulation of the AR object, including object placement, selection, rotation, scaling, and translation. These manipulations were implemented using the ARBaseGestureInteractable class instead of MonoBehaviour so that the application can detect gestures on the device.
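
As a rough illustration of this approach, the sketch below shows a minimal custom interactable deriving from ARBaseGestureInteractable that rotates its object with a two-finger twist gesture. It is a simplified stand-in for the toolkit's built-in rotation interactable, not the actual ARID code, and the class name is hypothetical.

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.AR;

// Minimal sketch: rotating a placed object with a two-finger twist gesture.
// The built-in AR rotation interactable additionally tracks selection state.
public class SimpleTwistRotationInteractable : ARBaseGestureInteractable
{
    protected override bool CanStartManipulationForGesture(TwistGesture gesture)
    {
        // Only react when the twist starts on this object or on empty space.
        return gesture.targetObject == null || gesture.targetObject == gameObject;
    }

    protected override void OnContinueManipulation(TwistGesture gesture)
    {
        // Apply the incremental twist angle around the world up axis.
        transform.Rotate(0f, -gesture.deltaRotation, 0f, Space.World);
    }
}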

For the placement of the AR object, the existing placement script in the XR Interaction Toolkit did not suit the requirements of this project, so a placement indicator was implemented for more accurate placement of objects. The placement indicator is fixed to the center of the screen, and the selected object is instantiated at the placement indicator when the user taps the screen.
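
A minimal sketch of such a placement indicator is given below, assuming AR Foundation's ARRaycastManager and legacy touch input; the component and field names (PlacementIndicator, indicatorVisual, objectToPlace) are illustrative rather than the actual ARID implementation.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class PlacementIndicator : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private GameObject indicatorVisual; // reticle shown at the screen center
    [SerializeField] private GameObject objectToPlace;   // currently selected furniture prefab

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        // Cast a ray from the center of the screen against detected planes.
        var screenCenter = new Vector2(Screen.width / 2f, Screen.height / 2f);
        if (raycastManager.Raycast(screenCenter, hits, TrackableType.PlaneWithinPolygon))
        {
            var pose = hits[0].pose;
            indicatorVisual.SetActive(true);
            indicatorVisual.transform.SetPositionAndRotation(pose.position, pose.rotation);

            // On tap, instantiate the selected object at the indicator's pose.
            if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
            {
                Instantiate(objectToPlace, pose.position, pose.rotation);
            }
        }
        else
        {
            indicatorVisual.SetActive(false);
        }
    }
}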

Figures 10 and 11 depict the menu window for selecting a piece of furniture or decoration. Users are able to select furniture and decorations from a store’s catalog and visualize the look and feel of the space or room they are decorating. Figure 12 provides a detailed look at and description of the furniture or decoration.

Fig. 10 Furniture panel in menu window

Fig. 11 Decorations panel in menu window

Fig. 12 Object detail panel

4.2 Change Wall and Floor Color

The change wall and floor color function in the AR session gives the user an extended ability to visualize the complete design of the room. Meshing was implemented to allow this function to work as intended. The approach places a plane object on the surface and uses meshing to occlude the objects in front of that surface. The AR Mesh Manager script provided by AR Foundation generates a mesh from the data produced by the device’s LIDAR scanner; the mesh is created around the physical objects in the scene and registered as a virtual object in the virtual scene. This is crucial to giving the user a realistic perception of the walls or floor changing color. The meshing was combined with the place AR object function to place the plane on the desired wall or floor. To change the color of the placed plane, the application simply changes the material of the plane to the desired color. Figure 13 depicts the operation of changing the wall and floor color in the AR session.
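
The color change itself can be as simple as swapping the base color of the placed plane's material, as in the hedged sketch below; the component name and fields are illustrative, and it assumes the plane's shader exposes a main color property.

using UnityEngine;

// Attached to the plane prefab that is placed on a wall or floor surface.
public class SurfaceColorChanger : MonoBehaviour
{
    [SerializeField] private Renderer planeRenderer;

    // Called by the UI color picker; only this plane's material instance is modified,
    // while the mesh generated by the AR Mesh Manager occludes real objects
    // standing in front of the surface.
    public void SetColor(Color newColor)
    {
        planeRenderer.material.color = newColor;
    }
}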

Fig. 13 Change wall/floor color

4.3 Collaborative Session

Figure 14 illustrates the create or join collaborative session screen. Users are able to create or join a collaborative session, enabling them to design collaboratively by sharing ideas and inspiration.

Fig. 14 Create/join collaborative session screen

Figure 15 depicts the flow of the collaborative session function once a collaborative session is started. Figure 16 depicts the screen elements of the AR collaborative session upon launching the collaborative session. Figure 17 depicts the operation of placing a shareable AR object in the AR collaborative session. Figure 18 depicts two devices currently starting an AR collaborative session.

Fig. 15 AR collaborative session algorithm flowchart

Fig. 16 AR collaborative session screen

Fig. 17 Place shareable AR object

Fig. 18 Multiple devices in an AR collaborative session

The collaborative session was implemented using AR Foundation’s AR Participant Manager and collaborative session script, together with Photon Engine. The collaborative session script enables users to join the same AR session over a local network; users must use the same service type to join the same AR session. Within the AR session, the collaborative session uses ARKit’s Multipeer Connectivity framework to send and receive collaboration data, which only includes trackables data such as participants, anchors, and point clouds.

Once the collaborative session is set up, Photon Engine is used to sync the objects placed in the session across all users through the Photon server. When creating a collaborative session, the user first connects to the Photon server. A PunRPC (Remote Procedure Call) is an attribute that allows a method to be called on remote clients in the same room; it is invoked whenever a user selects an object, syncing the object to instantiate across all participants in the AR session. To place a shareable AR object, an anchor is first placed and synced across all users, and the selected AR object is then placed at the position of the anchor to ensure the position is accurate for all users in the collaborative session.
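
The sketch below illustrates the idea behind such an RPC, assuming Photon PUN 2; the class name, method names, and the Resources-based prefab lookup are simplifications for illustration rather than the exact ARID code.

using Photon.Pun;
using UnityEngine;

// Minimal sketch: replaying an object placement on every client in the same Photon room.
public class SharedObjectSpawner : MonoBehaviourPun
{
    // Called locally after the shared anchor has been placed and synced.
    public void PlaceSharedObject(string prefabName, Vector3 anchorPosition, Quaternion rotation)
    {
        photonView.RPC(nameof(PlaceObjectRpc), RpcTarget.All, prefabName, anchorPosition, rotation);
    }

    [PunRPC]
    private void PlaceObjectRpc(string prefabName, Vector3 anchorPosition, Quaternion rotation)
    {
        // Each participant instantiates the same prefab at the shared anchor's pose.
        var prefab = Resources.Load<GameObject>(prefabName);
        Instantiate(prefab, anchorPosition, rotation);
    }
}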

5 Discussion

Table 4 compares the proposed ARID application with existing interior design mobile applications. Our proposed application, ARID, provides all the functionalities of existing interior design applications and has also successfully overcome their limitations by implementing additional functionalities such as scanning multiple surfaces (a limitation identified in the IKEA Place and Homestyler apps) and placing objects on walls (a limitation identified in the IKEA Place app). The proposed ARID application is also able to start a collaborative session, change the wall color, and change the floor color, none of which is currently implemented in the existing AR-enabled applications. The proposed ARID application also supports occlusion, which is not supported by the Houzz and Homestyler applications.

Table 4 Comparison of the proposed application (ARID) with existing applications

Although our ARID application has successfully met the requirements of the end users, we acknowledge that it has both strengths and limitations. The ARID application allows the user to design a room by placing multiple objects on multiple surfaces, for example, placing a table on the ground and then placing a decoration such as a painting on the wall, enabling a more complete view of the entire design of the room. A function that is available in the ARID application but not in existing AR applications for interior design is the ability to change the color of the wall and floor; users can preview the feel and aesthetic of the room or space with different colors on the wall and floor. Additionally, the AR collaborative session in the proposed application enables individuals to connect and combine creative ideas by allowing multiple users to design a space or room together. This minimizes problems faced when designing a room or space, such as not being able to visualize the final design and not knowing whether a certain piece of furniture or decoration will fit into the space.

When considering the user interface (UI) design of the proposed ARID application, a minimal and clean design approach was taken to ensure minimal interference with the immersiveness of the application and to allow for better navigation throughout the different functions.

The limitations of the proposed application lie in the change wall and floor color function and the collaborative session function. The change wall and floor color function requires further development and experimentation to detect a wider range of colors in the room or space and change them. Further, the collaborative session only supports placing shareable AR objects and does not allow interaction with the shareable object, such as repositioning and rotating it. The type of hardware supported is another limitation: as the proposed application was developed using Apple ARKit, it is presently not available to Android users. Finally, the change wall and floor color function requires a device with LIDAR capability; hence, users with an iPhone 11 or older model will not be able to use this function.

6 Conclusion

The resulting ARID application stands out from existing AR-enabled interior design applications through its added functionality and improved user interface (UI) design. The added functionality includes enabling users to place furniture on both horizontal and vertical planes, change the color of the wall and floor, and start a collaborative session. The improved user design was verified through the feedback gathered from the user acceptance testing, which revealed that the application was easy to use and that the added functionality was beneficial for interior design. It is hoped that future work on the ARID application will address its limitations. Artificial Intelligence may be introduced for the change wall and floor color function, along with the option to change the texture of the walls and floor, allowing the function to produce more precise and accurate results. There is also potential to improve the collaborative session function: the code could be further dissected to directly send data such as the prefab and its position and rotation, which would eliminate the need to implement Photon Engine.