In the AI era, computing power, algorithms, and data play a decisive role in advancing large-scale foundation models. Cloud computing, renowned for its flexibility and scalability, is an indispensable component of the infrastructure required to train such models. Joint-cloud computing (JCC) leverages advanced communication and networking technologies to achieve high-speed interconnection among data centers and heterogeneous computing resources. JCC aims to establish a convenient resource-access framework, standardized task scheduling, and a sustainable operational model tailored to the requirements of large-scale foundation model training and inference, thereby contributing significantly to the advancement of computing and networking infrastructure. However, adopting JCC introduces numerous technical challenges in establishing, operating, and managing multi-cloud and hybrid-cloud environments. Moreover, training large foundation models hinges on the effective utilization of local computing resources together with the integration of supercomputing resources, intelligent computing resources, and more. JCC encompasses a wide array of cloud paradigms, including but not limited to hybrid cloud, inter-cloud, multi-cloud, cloud federation, cross-cloud, cloud service brokerage, and intelligent computing networks.