The cinema of the United States, produced mainly by major film studios (collectively known as Hollywood) along with a substantial independent film sector, has had a profound effect on the global film industry since the early 20th century. The dominant style of American cinema is classical Hollywood cinema, which developed from 1913 to 1969 and remains typical of most films made there to this day. While the French brothers Auguste and Louis Lumière are generally credited with the birth of modern cinema, American cinema soon became a dominant force in the emerging industry.