Training deep neural networks with multi-domain data generally yields greater robustness and accuracy than training with single-domain data, which has led to the development of many deep learning-based algorithms that rely on multi-domain inputs. However, if part of the input is unavailable because data are missing or corrupted, significant bias can occur, a problem that is especially critical in medical applications, where patients may be negatively affected. In this study, we propose the Laplacian filter attention with style transfer generative adversarial network (LASTGAN) to address missing sequences in brain tumor magnetic resonance imaging (MRI). Our method combines image imputation and image-to-image translation to accurately synthesize specific missing MR sequences. LASTGAN accurately synthesizes both the overall anatomical structure and the tumor regions of the brain in MR images by employing a novel attention module based on a Laplacian filter. Additionally, among the sub-networks, the generator injects a style vector for the missing domain, which is subsequently inferred by the style encoder, while the style mapper assists the generator in synthesizing domain-specific images. We show that LASTGAN synthesizes higher-quality MR images than other existing GAN-based methods. Furthermore, we validate the use of LASTGAN for data imputation and augmentation through segmentation experiments.
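The abstract does not specify how the Laplacian filter attention module is implemented. Below is a minimal, hypothetical PyTorch sketch of one plausible design, assuming the module applies a fixed 3x3 Laplacian kernel to incoming features to emphasize edges (e.g., tumor boundaries) and uses the result as a sigmoid-gated spatial attention map; the class name, the 1x1 squeeze convolution, and the residual re-weighting are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LaplacianAttention(nn.Module):
    """Hypothetical sketch of a Laplacian-filter-based spatial attention block."""

    def __init__(self, channels: int):
        super().__init__()
        # Fixed (non-learnable) 3x3 Laplacian kernel, applied per channel.
        lap = torch.tensor([[0., 1., 0.],
                            [1., -4., 1.],
                            [0., 1., 0.]])
        self.register_buffer("kernel", lap.view(1, 1, 3, 3).repeat(channels, 1, 1, 1))
        self.channels = channels
        # 1x1 convolution collapsing the edge response into a single attention map.
        self.squeeze = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Depthwise convolution with the fixed Laplacian kernel highlights boundaries.
        edges = F.conv2d(x, self.kernel, padding=1, groups=self.channels)
        # Spatial attention map in [0, 1], broadcast over channels.
        attn = torch.sigmoid(self.squeeze(edges))
        # Residual re-weighting keeps the original features while boosting edge regions.
        return x * (1.0 + attn)


if __name__ == "__main__":
    feats = torch.randn(2, 64, 128, 128)   # batch of intermediate feature maps
    out = LaplacianAttention(64)(feats)
    print(out.shape)                        # torch.Size([2, 64, 128, 128])
```

Such a block could sit inside the generator so that boundary-sensitive regions, like tumor margins, receive larger weights during synthesis; the actual placement and training details would follow from the full paper rather than this sketch.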