NASA TM 88946
NASA-STD-7009: Standard for Models and Simulations

NASA-STD-7009
NASA TECHNICAL STANDARD
Approved: 07-11-2008
Expiration Date: 07-10-2013
Superseding NASA-STD-(I)-7009
National Aeronautics and Space Administration, Washington, DC 20546-0001

STANDARD FOR MODELS AND SIMULATIONS

MEASUREMENT SYSTEM IDENTIFICATION: NOT MEASUREMENT SENSITIVE

DOCUMENT HISTORY LOG
Status | Approval Date | Description
Interim | 12-01-2006 | Interim Release
Baseline | 07-11-2008 | Baseline Release

FOREWORD

This standard is published by the National Aeronautics and Space Administration (NASA) to provide uniform engineering and technical requirements for processes, procedures, practices, and methods that have been endorsed as standard for models and simulations (M&S) developed and used in NASA programs and projects, including requirements for selection, application, and design criteria of an item. This standard was specifically developed to respond to Action 4 from the 2004 report "A Renewed Commitment to Excellence," with consideration also given to related findings identified in the Columbia Accident Investigation Board (CAIB) Report.

This standard is approved for use by NASA Headquarters and NASA Centers, including Component Facilities.

This standard covers the development and operation (or execution) of M&S, as well as the analysis and presentation of the results from M&S. It also covers the proper training of M&S practitioners and the identification of recommended practices, while ensuring that the credibility of M&S results is assessed and properly conveyed to those making critical decisions.

Requests for information, corrections, or additions to this standard should be submitted via "Feedback" in the NASA Technical Standards System.

Original Signed By Michael G. Ryschkewitsch, NASA Chief Engineer. Approval Date: July 11, 2008.

TABLE OF CONTENTS
DOCUMENT HISTORY LOG
FOREWORD
TABLE OF CONTENTS
LIST OF FIGURES
LIST OF TABLES
1. SCOPE
1.1 Purpose
1.2 Applicability
1.3 Focus
2.
APPLICABLE DOCUMENTS
2.1 General
2.2 Government Documents
2.3 Non-Government Documents
2.4 Order of Precedence
3. ACRONYMS AND DEFINITIONS
3.1 Acronyms and Abbreviations
3.2 Definitions
4. REQUIREMENTS
4.1 Programmatics
4.2 Models
4.3 Simulations and Analyses
4.4 Verification, Validation, and Uncertainty Quantification
4.5 Identification and Use of Recommended Practices
4.6 Training
4.7 Assessing the Credibility of M&S Results
4.8 Reporting Results to Decision Makers
5. GUIDANCE
5.1 Reference Documents
5.2 Key Word Listing
Appendix A: M&S Risk Assessment
Appendix B: Credibility Assessment Scale
Appendix C: Compliance Matrix

LIST OF FIGURES
1 Sample M&S Risk Assessment Matrix
2 Credibility Assessment Scale
3 Subfactor Weights
4 Bar Chart and Radar Plot for Factor Scores
5 Deficiency Flag Illustration
6 Exceedance Flag Illustration
7 Sufficiency Thresholds and Color Coding on Bar Chart and Radar Plot for Factor Scores

LIST OF TABLES
1 Key Aspects of Credibility Assessment Levels
2 Level Definitions for Evidence Subfactors in the M&S Development Category
3 Level Definitions for Evidence Subfactors in the M&S Operations Category
4 Level Definitions for Factors in the Supporting Evidence Category
5 Level Definitions for the Technical Review Subfactors
6 Roll-up of Subfactor Scores to Factor Score
7 Roll-up of Factor Scores to Overall Score
8 Roll-up of Factor Scores to Category Score

STANDARD FOR MODELS AND SIMULATIONS
1.
SCOPE

1.1 Purpose

This standard was developed in response to Action 4 from the 2004 report "A Renewed Commitment to Excellence," which stated the following:

"Develop a standard for the development, documentation, and operation of models and simulations
a. Identify best practices to ensure that knowledge of operations is captured in the user interfaces (e.g., users are not able to enter parameters that are out of bounds),
b. Develop process for tool verification and validation, certification, reverification, revalidation, and recertification based on operational data and trending,
c. Develop standard for documentation, configuration management, and quality assurance,
d. Identify any training or certification requirements to ensure proper operational capabilities,
e. Provide a plan for tool management, maintenance, and obsolescence consistent with modeling/simulation environments and the aging or changing of the modeled platform or system,
f. Develop a process for user feedback when results appear unrealistic or defy explanation."

Subsequently, in 2006, the NASA Chief Engineer provided the following further guidance:

g. "Include a standard method to assess the credibility of the models and simulations presented to the decision maker when making critical decisions (i.e., decisions that affect human safety or mission success) using results from models and simulations,
h. Assure that the credibility of models and simulations meets the project requirements."

Each of the requirements and recommendations in this standard can be traced to one or more of the eight objectives listed above.
The traceability matrix mapping the requirements in this standard to the eight objectives can be found online upon accessing this standard at URL; refer to "Requirements Traceability Matrix." Some of these objectives are met by recommendations rather than by requirements, as a result of either (a) the practical impossibility of satisfying the requirement in all cases or (b) further guidance received from NASA Headquarters.

These eight objectives are encapsulated in the overall goal of this standard: to ensure that the credibility of the results from M&S is properly conveyed to those making critical decisions. Critical decisions based on M&S results, as defined by this standard, are those technical decisions related to design, development, manufacturing, ground, or flight operations that may impact human safety or program/project-defined mission success criteria. The intent is to reduce the risks associated with critical decisions. This standard covers the development and operation (or execution) of M&S as well as the processes of analysis and presentation of the results from the M&S.

This standard addresses aspects of M&S that are common across NASA activities. Discipline-specific details of M&S should be addressed in future documents, such as Recommended Practices (usually entitled "Handbooks" in the NASA document hierarchy), and are not included in this standard.

The scope of this standard covers the development and maintenance of models, the operation of simulations, the analysis of the results, training, recommended practices, the assessment of M&S credibility, and the reporting of M&S results.
Some of the key features of this standard are requirements and recommendations for verification, validation, uncertainty quantification, training, credibility assessment, and reporting to decision makers; also included are the cross-cutting areas of documentation and configuration management (CM).

The requirements and recommendations in sections 4.7 and 4.8 are the culmination of the standard. Those in sections 4.1 through 4.6 are intended to support them, by ensuring that sufficient details of the M&S process, along with intermediate results, are available to support the requirements in sections 4.7 and 4.8 and to respond to in-depth queries by the decision maker. Appendix A provides guidance for assessing the risk of using M&S in engineering decisions. Appendix B provides details related to some of the requirements and recommendations in sections 4.7 and 4.8. Appendix C contains a template for a compliance matrix.

1.2 Applicability

This standard applies to M&S used by NASA and its contractors for critical decisions in design, development, manufacturing, ground operations, and flight operations. (Guidance for determining which particular M&S are in scope is provided in section 4.1 and Appendix A.) This standard also applies to the use of legacy as well as commercial-off-the-shelf (COTS), government-off-the-shelf (GOTS), and modified-off-the-shelf (MOTS) M&S to support critical decisions. For such M&S, particular attention may need to be paid to defining the limits of operation and to verification, validation, and uncertainty quantification. Programs and projects are encouraged to apply this standard to any M&S whose results may impact future critical decisions.

This standard does not apply to M&S that are embedded in control software, emulation software, and stimulation environments.
However, Center implementation plans for NPR 7150.2, NASA Software Engineering Requirements, should specifically cover embedded M&S and address such M&S-specific issues as numerical accuracy, uncertainty analysis, sensitivity analysis, M&S verification, and M&S validation.

This standard may be cited in contract, program, and other Agency documents as a technical requirement. Requirements are indicated by the word "shall"; explanatory or guidance text is indicated in italics.

1.2.1 Tailoring of this standard for application to a specific program or project shall be formally documented as part of program or project requirements and approved by the Technical Authority.

1.3 Focus

In general, standards may focus on engineering/technical requirements, processes, procedures, practices, or methods. This standard focuses on requirements and recommendations. Hence, this standard specifies what shall or should be done; it does not prescribe how the requirements are to be met, nor does it specify who the responsible party for complying with the requirements is.

2. APPLICABLE DOCUMENTS

2.1 General

The documents listed in this section contain provisions that constitute requirements of this standard as cited in the text of section 4.

2.1.1 The latest issuances of cited documents shall be used unless otherwise approved by the assigned Technical Authority.

The applicable documents are accessible via the NASA Online Directives Information System and the NASA Technical Standards System, or may be obtained directly from the Standards Developing Organizations or other document distributors.

2.2 Government Documents

None.

2.3 Non-Government Documents

None.

2.4 Order of Precedence

This document establishes requirements and guidance for models and simulations but does not supersede or waive established Agency requirements found in other documentation.

2.4.1 Conflicts between this standard and other requirements documents shall be resolved by the responsible Technical Authority.

3.
ACRONYMS AND DEFINITIONS

3.1 Acronyms and Abbreviations

AIAA  American Institute of Aeronautics and Astronautics
ASME  American Society of Mechanical Engineers
CAIB  Columbia Accident Investigation Board
CAS  Credibility Assessment Scale
CM  Configuration Management
COTS  Commercial-Off-The-Shelf
CPIAC  Chemical Propulsion Information Analysis Center
DMSO  Defense Modeling and Simulation Office
GOTS  Government-Off-The-Shelf
IEEE  Institute of Electrical and Electronics Engineers
ISG  Implementation Study Group
ISO  International Organization for Standardization
JANNAF  Joint Army-Navy-NASA-Air Force
M&S  Models and Simulations
MOTS  Modified-Off-The-Shelf
NASA  National Aeronautics and Space Administration
NASTRAN  NASA Structural Analysis
NPR  NASA Procedural Requirements
PMBA  Primary Mirror Backplane Assembly
Req.  Requirement
RPG  Recommended Practices Guide
SISO  Simulation Interoperability Standards Organization
STD  Standard
V&V  Verification & Validation
VV&A  Verification, Validation, and Accreditation

3.2 Definitions

The definitions listed below are those used in this document. Wherever possible, these definitions have been taken from official NASA documents. In some cases, after reviewing definitions of interest in the International Organization for Standardization (ISO), the Defense Modeling and Simulation Office (DMSO), professional society publications, and English language dictionaries, some of these definitions were taken or adapted from these sources to achieve the goal and objectives stated in section 1.1. Some definitions may have alternate meanings in other documents and disciplines.

Abstraction: The process of selecting the essential aspects of a reference system to be represented in a model or simulation while ignoring those aspects that are not relevant to the purpose of the model or simulation (adapted from Fidelity ISG Glossary, Vol.
3.0).

Accuracy: The difference between a parameter or variable (or a set of parameters or variables) within a model, simulation, or experiment and the true value or the assumed true value.

Analysis: Any post-processing or interpretation of the individual values, arrays, files of data, or suites of executions resulting from a simulation.

Artifact: Any tangible product produced by the project team, e.g., requirements documents, help systems, code, executables, test documentation, test results, diagrams, etc.

Calibration: The process of adjusting numerical or modeling parameters in the model to improve agreement with a referent.

Computational Model: The numerical representation of the mathematical model.

Conceptual Model: The collection of abstractions, assumptions, and descriptions of physical processes representing the behavior of the reality of interest from which the mathematical model or validation experiments can be constructed (adapted from ASME V&V 10).

Configuration Management (CM): A management discipline applied over the product's life cycle to provide visibility into and to control changes to performance, functional, and physical characteristics (NPR 7120.5D, NASA Space Flight Program and Project Management Requirements).

Credibility: The quality to elicit belief or trust in M&S results.

Critical Decision: Those technical decisions related to design, development, manufacturing, ground, or flight operations that may impact human safety or mission success, as measured by program/project-defined criteria.

Emulation: The use of an M&S to imitate another system, so that the M&S behaves like or appears to be the other system.

Endorsement: A formal assurance that a product, process, or service conforms to specified characteristics.
(Examples of endorsement include "accreditation," the official acceptance of a model or simulation and its associated data for use for a specific purpose, and "certification," which is similar to accreditation but often applies to a class of purposes or a general domain and generally implies an independent and/or third-party certifier.)

Human Safety: The condition of being protected from death, permanently disabling injury, severe injury, and several occupational illnesses. In the NASA context this refers to the safety of the public, astronauts, pilots, and the NASA workforce (adapted from NPR 8000.4 and the NASA Safety Hierarchy).

Limits of Operation: The boundary of the set of parameters for which an M&S result is acceptable based on the program/project-required outcomes of verification, validation, and uncertainty quantification.

Mathematical Model: The mathematical equations, boundary values, initial conditions, and modeling data needed to describe the conceptual model (ASME V&V 10).

Mission Success Criteria: Standards against which the program or project will be deemed a success. Mission success criteria may be both qualitative and quantitative and may cover mission cost, schedule, and performance results as well as actual mission outcomes (NPR 7120.5C, NASA Program and Project Management Processes and Requirements).

Model: A description or representation of a system, entity, phenomenon, or process (adapted from Banks, J., ed. (1998). Handbook of Simulation. New York: John Wiley & Sons). (A model may be constructed from multiple sub-models; the sub-models and the integrated sub-models are all considered models. Likewise, any data that go into a model are considered part of the model.
A model of a model (commonly called a metamodel), e.g., a response surface constructed from the results of M&S, is considered a model.)

Referent: Data, information, knowledge, or theory against which simulation results can be compared (adapted from ASME V&V 10).

Risk: The combination of the probability that a program or project will experience an undesired event and the consequences, impact, or severity of the undesired event, were it to occur. Both the probability and the consequences may have associated uncertainties (adapted from NPR 7120.5D).

Sensitivity Analysis: The study of how the variation in the output of a model can be apportioned to different sources of variation in the model input and parameters (adapted from Saltelli and others, 2000).

Simulation: The imitation of the characteristics of a system, entity, phenomenon, or process using a computational model.

Stimulation: A type of simulation in which artificially generated signals are provided to real equipment in order to trigger it to produce the result required for verification of a real-world system, for training, for maintenance, or for research and development.

Subject Matter Expert: An individual having education, training, or experience in a particular technical or operational discipline, system, or process, who participates in an aspect of M&S requiring his or her expertise.

Tailoring: The documentation and approval of the adaptation of the processes and approach to complying with requirements according to the purpose, complexity, and scope of a NASA program or project (NPR 7123.1A, NASA Systems Engineering Processes and Requirements).

Uncertainty: (1) The estimated amount or percentage by which an observed or calculated value may differ from the true value (The American Heritage Dictionary of the English Language, 4th ed.).
(2) A broad and general term used to describe an imperfect state of knowledge or a variability resulting from a variety of factors including, but not limited to, lack of knowledge, applicability of information, physical variation, randomness or stochastic behavior, indeterminacy, judgment, and approximation (adapted from NPR 8715.3B, NASA General Safety Program Requirements).

Uncertainty Quantification: The process of identifying all relevant sources of uncertainties, characterizing them in all models, experiments, and comparisons of M&S results and experiments, and quantifying uncertainties in all relevant inputs and outputs of the simulation or experiment.

Validation: The process of determining the degree to which a model or a simulation is an accurate representation of the real world from the perspective of the intended uses of the model or the simulation.

Verification: The process of determining that a computational model accurately represents the underlying mathematical model and its solution from the perspective of the intended uses of M&S.

Waiver: A documented authorization intentionally releasing a program or project from meeting a requirement (NPR 7120.5D). Deviations and exceptions are considered special cases of waivers.

4. REQUIREMENTS

This standard establishes a minimum set of requirements and recommendations for the use of M&S to support critical decisions. For decisions based on results from M&S, the risk assumed by the decision maker is often misestimated because uncertainties within M&S development, verification, validation, execution, analysis, and reporting are inadequately assessed. This standard establishes practices to enable a more accurate assessment of this risk by making M&S credibility more apparent to the decision maker.
This standard emphasizes documentation and CM of M&S to enforce transparency, repeatability, and traceability, and it requires that key M&S personnel receive appropriate training.

The requirements and recommendations are generic in nature because of their broad applicability to all types of M&S. Implementation details of the M&S requirements should be addressed in discipline-specific Recommended Practices, project/program management plans, and similar documents.

The following organizational structure is employed in this standard:
4.1 Programmatics
4.2 Models
4.3 Simulations and Analyses
4.4 Verification, Validation, and Uncertainty Quantification
4.5 Identification and Use of Recommended Practices
4.6 Training
4.7 Assessing the Credibility of M&S Results
4.8 Reporting Results to Decision Makers

In many instances, the modeling, simulation, and analysis activities are interwoven, particularly during the development, verification, and validation phases. This standard is intended to be inclusive of all these possibilities.

Many of the requirements in this standard require documentation. With the exception of the documentation required for reports to decision makers (section 4.8), this documentation may consist of a reference to other existing documents, such as a journal article, a technical report, or a program/project document, provided that all the required details are contained in the referenced document(s).

4.1 Programmatics

Critical decisions that are based entirely or partially on M&S are usually made within the context of a program or project.
Program and project management have the responsibility to identify and document the parties responsible for complying with the requirements in this standard. The actual person identified by program and project management to fulfill the role of the "responsible party" in specific requirements will likely vary depending upon the context of the requirement; for example, the responsible party might be the lead or another supporting person associated with the model development, operation, analysis, and/or reporting of results to decision makers.

Program and project management, in collaboration with the Technical Authority, have the responsibility to identify and document the extent and level of formality of documentation needed to meet the documentation requirements in this standard. Some requirements, in particular 4.1.5, 4.2.6, 4.2.8, 4.3.6, 4.4.1, 4.4.2, 4.4.4, 4.4.5, 4.4.6, 4.4.7, 4.4.8, and 4.4.9, are to be interpreted as meaning that the activity in question is not required per se, but that whatever was done is to be documented, and if nothing was done, a clear statement to that effect is to be documented.

Program and project management, in collaboration with the Technical Authority, also have the responsibility to identify and document the critical decisions to be addressed with M&S and to determine which M&S are in scope. The latter determination should be based upon the risk posed by the anticipated use of the M&S. Appendix A describes a representative M&S risk assessment matrix for this purpose. Furthermore, the Technical Authority has the particular responsibility to assure appropriate outcomes of Req. 4.1.3.

The responsible party performs the following:

Req. 4.1.1 – Shall document the risk assessment for any M&S used in critical decisions.

Req. 4.1.2 – Shall identify and document those M&S that are in scope.

Req. 4.1.3 – Shall define the objectives and requirements for M&S products, including the following:
a.
The acceptance criteria for M&S products, including any endorsement for the M&S.
b. The rationale for the weights used for the subfactors in the Credibility Assessment Scale (see Appendix B.4).
c. Intended use.
d. Metrics (programmatic and technical).
e. Verification, validation, and uncertainty quantification (see section 4.4).
f. Reporting of M&S information for critical decisions (see section 4.8).
g. CM (artifacts, timeframe, processes) of M&S.

(The acceptance criteria in 4.1.3 (a) include specification of what constitutes a favorable comparison for the Verification Evidence, Validation Evidence, Input Pedigree Evidence, and Use History level definitions in the Credibility Assessment Scale (see Appendix B).)

Req. 4.1.4 – Shall develop a plan (including identifying the responsible organization(s)) for the acquisition, development, operation, maintenance, and/or retirement of the M&S.

Req. 4.1.5 – Shall document any technical reviews performed in the areas of Verification, Validation, Input Pedigree, Results Uncertainty, and Results Robustness (see Appendix B).

Req. 4.1.6 – Shall document M&S waiver processes.

Req. 4.1.7 – Shall document the extent to which an M&S effort exhibits the characteristics of work product management, process definition, process measurement, process control, process change, and continuous improvement, including CM and M&S support and maintenance.

4.2 Models

The processes of developing conceptual, mathematical, or computational models are all considered to be modeling activities. Empirically adjusting the results of a simulation in an attempt to improve correlation is considered a modeling activity.

For models, the responsible party performs the following:

Req. 4.2.1 – Shall document the assumptions and abstractions underlying the conceptual model, including their rationales.

Req.
4.2.2 – Shall document the basic structure and mathematics of the model (e.g., reality modeled, equations solved, behaviors modeled, conceptual models).

(For COTS, GOTS, MOTS, and legacy M&S, some of the documentation required in 4.2.1 and 4.2.2 may be available in published user guides; a reference to the user guides will suffice for this part of the documentation.)

Req. 4.2.3 – Shall document data sets and any supporting software used in model development and input preparation.

Req. 4.2.4 – Shall document required units and vector coordinate frames (where applicable) for all input/output variables in the M&S.

Req. 4.2.5 – Shall document the limits of operation of models.

Req. 4.2.6 – Shall document any methods of uncertainty quantification and the uncertainty in any data used to develop the model or incorporated into the model.

Req. 4.2.7 – Shall document guidance on proper use of the model. (Guidance on proper use of a model includes descriptions of appropriate practices for set-up, execution, and analysis of results.)

Req. 4.2.8 – Shall document any parameter calibrations and the domain of calibration.

Req. 4.2.9 – Shall document updates of models (e.g., solution adjustment, change of parameters, calibration, and test cases) and assign a unique version identifier, a description, and the justification for the updates.

Req. 4.2.10 – Shall document obsolescence criteria and the obsolescence date of the model. (Obsolescence refers to situations where changes to the real system invalidate the model; see item (e) of Diaz Action #4.)

Req. 4.2.11 – Shall provide a feedback mechanism for users to report unusual results to model developers or maintainers.

Req. 4.2.12 – Shall maintain (conceptual, mathematical, and computational) models and associated documentation in a controlled CM system.

Req. 4.2.13 – Shall maintain the data sets and supporting software referenced in Req. 4.2.3 and the associated documentation in a controlled CM system.
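Tables 6 to 8 in Appendix B describe rolling subfactor scores up through factor, category, and overall scores using the subfactor weights of Figure 3 (the weights whose rationale Req. 4.1.3 (b) asks to be documented). The normative roll-up rules and weights are the ones defined in Appendix B; purely to illustrate the bookkeeping, and assuming a weighted average for subfactor-to-factor roll-up and a most-conservative minimum for the overall score (both of which are assumptions for this sketch, not the standard's definitions), the computation might look like:

```python
# Illustrative roll-up of Credibility Assessment Scale scores.
# The weighted-average and minimum rules below are ASSUMPTIONS for
# illustration only; the normative rules and weights are those in
# Appendix B of NASA-STD-7009.

def factor_score(subfactor_levels, weights):
    """Weighted average of subfactor levels (assumed roll-up rule)."""
    assert len(subfactor_levels) == len(weights)
    total_weight = sum(weights)
    return sum(lvl * w for lvl, w in zip(subfactor_levels, weights)) / total_weight

def overall_score(factor_scores):
    """Most conservative (minimum) factor score (assumed roll-up rule)."""
    return min(factor_scores)

# Hypothetical example: two factors, each with two subfactors.
verification = factor_score([3, 2], weights=[0.5, 0.5])    # 2.5
validation = factor_score([4, 2], weights=[0.75, 0.25])    # 3.5
print(overall_score([verification, validation]))           # -> 2.5
```

Whatever the actual rules chosen, keeping the roll-up in one small, auditable function makes the rationale behind a reported credibility score easy to trace, which is the spirit of Req. 4.1.3 (b).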
Introduction to SRTM

1. Background of SRTM

The United States began observing Earth with imaging radar carried aboard the Space Shuttle in the early 1980s. In November 1981 and October 1984, the United States flew two Shuttle Imaging Radar experiments, SIR-A and SIR-B. The results demonstrated that synthetic aperture radar (SAR) has advantages no other sensor can match, chiefly the ability to image and map the land surface in all weather, day and night, with a degree of surface penetration.

In April and October 1994, NASA, the Jet Propulsion Laboratory (JPL), and the National Imagery and Mapping Agency (NIMA), in cooperation with the German and Italian space agencies, conducted two radar topographic-mapping experiments aboard the Shuttle known as SIR-C/X-SAR. Compared with the two earlier flights, the SIR-C/X-SAR hardware advanced from single polarization (HH or VV), a single band (L-band), and a fixed incidence angle to multiple polarizations, multiple bands (C, L, and X), and multiple incidence angles. The results showed that spaceborne SAR, through repeat-pass interferometry, has great potential for extracting high-accuracy global terrain elevation data and for monitoring crustal deformation.

However, because Shuttle flights are short (typically about 10 days), repeat-pass radar data could be acquired only over limited areas, and some repeat-pass data proved unusable for interferometry. American specialists therefore proposed upgrading the existing SIR-C/X-SAR hardware and adding an extendable antenna mast to the Shuttle, so that global radar interferometry data could be acquired in a single pass with a fixed baseline and a high-accuracy global DEM could be derived. NASA and DMA (the Defense Mapping Agency, now NIMA) subsequently negotiated an agreement to carry out the SRTM mission.

2. SRTM at a glance

SRTM (the Shuttle Radar Topography Mission) was begun in February 2000 by NASA, the National Geospatial-Intelligence Agency, and the German and Italian space agencies. At 11:44 a.m. on February 11, 2000, the Space Shuttle Endeavour lifted off from the launch center at Cape Canaveral, Florida. The SRTM system aboard Endeavour collected data for 222 hours and 23 minutes, acquiring 9.8 terabytes of radar imagery between 60° N and 56° S, an area of more than 119 million square kilometers covering over 80 percent of Earth's land surface, including all of China. The mission cost 364 million US dollars.
The Apollo Program

The Apollo program (Apollo Project), also known as Project Apollo, was the series of crewed lunar landing missions undertaken by the United States from 1961 to 1972. Carried out from the 1960s to the early 1970s, this crewed lunar landing effort stands as an epoch-making achievement in the history of spaceflight. The program began in May 1961 and concluded in December 1972 with the sixth successful lunar landing, spanning about 11 years and costing 25.5 billion US dollars. At its peak, the program involved 20,000 companies, more than 200 universities, and over 80 research institutions, with a total workforce exceeding 300,000 people.

Program details: On July 16, 1969, the giant Saturn V rocket carrying the Apollo 11 spacecraft lifted off from the Cape Kennedy launch site, beginning humanity's first voyage to the surface of the Moon. Astronauts Neil Armstrong, Edwin "Buzz" Aldrin, and Michael Collins flew Apollo 11 across the 380,000-kilometer gulf, carrying the dream of all humanity to the lunar surface; humans first set foot on the Moon on July 21, 1969. It was indeed one small step for a man, yet a giant leap for all mankind. They witnessed the realization of the dream of traveling from the Earth to the Moon, a single step across five thousand years of longing.

Apollo, in ancient Greek mythology, is the sun god who presides over poetry and music. Legend holds that he was the younger brother of the Moon goddess and that he slew a giant python with his golden arrows to avenge his mother. The mood of the US government in naming its lunar landing program after this score-settling sun god is easy to imagine.

The Apollo program was proposed in the early 1960s, during the Eisenhower administration, as a successor to Project Mercury. Whereas the Mercury spacecraft could only reach Earth orbit and carry a single astronaut, the envisioned Apollo spacecraft would carry three astronauts and perhaps even land on the Moon. NASA manager Abe Silverstein chose to name the program after the sun god of Greek mythology. Although NASA had begun planning, Eisenhower showed little enthusiasm for spaceflight, and funding for Apollo was never secured. In November 1960, John F. Kennedy, who had campaigned on making the United States surpass the Soviet Union in space exploration and missile defense, was elected president. Although relatively keen on spaceflight, he did not commit to a lunar landing program immediately after taking office; Kennedy knew little about spaceflight, and the enormous cost of space exploration made him wary of a hasty decision.
AS8879: Aerospace Standard (UNJ Threads)

AEROSPACE STANDARD
SAE AS8879 Revision D: Controlled Radius Root with Increased Minor Diameter

SAE Technical Standards Board Rules provide that: "This report is published by SAE to advance the state of technical and engineering sciences. The use of this report is entirely voluntary, and its applicability and suitability for any particular use, including any patent infringement arising therefrom, is the sole responsibility of the user." SAE reviews each technical report at least every five years, at which time it may be reaffirmed, revised, or cancelled. SAE invites your written comments and suggestions. Copyright © 2004 SAE International. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior written permission of SAE.

TABLE 6 - Twelve Thread Series (Continued)
TABLE 7 - Sixteen Thread Series (Continued)
FIGURE 1 - External Thread Maximum Material Condition
FIGURE 2 - Internal Thread Maximum Material Condition
FIGURE 3 - Disposition of Tolerances and Crest Clearances

Prepared under the jurisdiction of the Aerospace Industry Screw Thread Conformity Task Force (STC-TF) and SAE Committee E-25, General Standards for Aerospace and Propulsion Systems.
Introduction to and Notes on SRTM Data

(By Luo Yao and Liu Jinping, Geographic Science Class 1, 2011 cohort)

1. Overview

SRTM data were produced jointly by NASA and the National Imagery and Mapping Agency (NIMA) of the US Department of Defense. SRTM stands for Shuttle Radar Topography Mission. At 11:44 a.m. on February 11, 2000, the Space Shuttle Endeavour lifted off from the launch center at Cape Canaveral, Florida. The SRTM system aboard Endeavour collected data for 222 hours and 23 minutes, acquiring 9.8 terabytes of radar imagery between 60° N and 56° S, covering more than 119 million square kilometers, over 80 percent of Earth's land surface, at a mission cost of 364 million US dollars. After more than two years of processing, the radar imagery was turned into a digital terrain elevation model: the SRTM terrain products available today. By accuracy, the SRTM terrain data divide into SRTM1 and SRTM3, with resolutions of 30 m and 90 m respectively (the publicly released data are currently the 90 m product). The products have been publicly released since 2003 and have gone through several revisions; the current version is V4.1.
(An accuracy table for the SRTM elevation products appears in the accompanying figure.)

2. Data characteristics

The survey data cover all of China. SRTM data are distributed as one file per cell of latitude and longitude, at two accuracies, 1 arc-second and 3 arc-seconds, called SRTM1 and SRTM3 (or the 30 m and 90 m data). An SRTM1 file contains 3601 × 3601 elevation samples; an SRTM3 file contains 1201 × 1201 samples.
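These grid sizes and the 30 m / 90 m labels follow directly from the arc-second spacing. A quick sanity check in pure Python (the 6,371 km spherical Earth radius is an approximation introduced here for illustration):

```python
import math

# Ground distance subtended by one arc-second along a meridian,
# using a spherical Earth of radius 6,371 km (an approximation).
EARTH_RADIUS_M = 6_371_000
METERS_PER_ARCSEC = 2 * math.pi * EARTH_RADIUS_M / (360 * 3600)

# Samples per 1-degree cell; the +1 is because grid points on the cell
# edges are shared with the neighboring cells.
def samples_per_degree(arcsec_spacing):
    return 3600 // arcsec_spacing + 1

print(round(METERS_PER_ARCSEC, 1))   # -> 30.9, the "30 m" of SRTM1
print(samples_per_degree(1))         # -> 3601 (SRTM1)
print(samples_per_degree(3))         # -> 1201 (SRTM3)
```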
The SRTM data are organized as one file per 5° × 5° cell, in 24 rows (from 60° S to 60° N) and 72 columns (from 180° W to 180° E). Files are named srtm_XX_YY.zip, where XX is the column number (01 to 72) and YY the row number (01 to 24).
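The tile name for a given point follows from this layout by simple arithmetic. A sketch, assuming (as the numbering above implies) that column 01 starts at 180° W and row 01 at 60° N:

```python
import math

def srtm_tile_name(lon, lat):
    """Map a longitude/latitude to an srtm_XX_YY.zip tile name.

    Assumes 5-degree tiles with column 01 starting at 180 degrees W
    and row 01 at 60 degrees N, as described above.
    """
    if not (-180 <= lon < 180 and -60 < lat <= 60):
        raise ValueError("outside the SRTM tiling extent")
    col = int(math.floor((lon + 180) / 5)) + 1   # 01..72, west to east
    row = int(math.floor((60 - lat) / 5)) + 1    # 01..24, north to south
    return f"srtm_{col:02d}_{row:02d}.zip"

print(srtm_tile_name(116.4, 39.9))   # Beijing -> srtm_60_05.zip
```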
(See Figure 2.) The SRTM3 files covering China are currently available free of charge; these are the 90 m data, in which each 90 m sample is the arithmetic mean of nine 30 m samples. The underlying tiles come one per 1° × 1° cell and are named X1X2X3X4.hgt.zip, where X1 is N or S (hemisphere), X2 is the latitude of the cell's lower edge, X3 is E or W, and X4 is the longitude of the cell's left edge.
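Once unzipped, a .hgt tile is raw binary rather than a self-describing format. The layout below is not stated in the text above; this minimal reader relies on the published SRTM convention: big-endian signed 16-bit samples, rows ordered north to south from the northwest corner, 1201 × 1201 samples for SRTM3, and -32768 marking voids.

```python
import struct

SAMPLES = 1201   # per-side sample count of an SRTM3 1-degree tile
VOID = -32768    # published SRTM no-data value

def read_hgt(path):
    """Read an unzipped SRTM3 .hgt tile into a list of rows (north first)."""
    with open(path, "rb") as f:
        raw = f.read()
    values = struct.unpack(f">{SAMPLES * SAMPLES}h", raw)  # big-endian int16
    return [values[r * SAMPLES:(r + 1) * SAMPLES] for r in range(SAMPLES)]

def elevation(rows, lon_frac, lat_frac):
    """Elevation at a fractional position inside the cell.

    lon_frac and lat_frac lie in [0, 1), measured from the cell's
    southwest corner; returns None for void samples.
    """
    row = round((1 - lat_frac) * (SAMPLES - 1))  # row 0 is the north edge
    col = round(lon_frac * (SAMPLES - 1))
    value = rows[row][col]
    return None if value == VOID else value
```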
Bookmark this: the most detailed introduction ever to the Landsat 1-9 series of datasets!
The US Landsat series of satellites is jointly managed by the National Aeronautics and Space Administration (NASA) and the US Geological Survey (USGS). Launched successively since 1972, the Landsat satellites form the US Earth-observation system for surveying Earth's resources and environment, formerly called the Earth Resources Technology Satellite (ERTS) program. The satellites' main missions are to survey underground minerals, ocean resources, and groundwater; to monitor and help manage agriculture, forestry, livestock, and water resources; to forecast crop yields; to study natural vegetation growth and landforms; to observe and forecast severe natural disasters (such as earthquakes) and environmental pollution; to image targets of all kinds; and to produce thematic maps (geological, geomorphological, hydrological, and so on).

From Landsat 1 through 9, all Landsat imagery is publicly available through USGS Earth Explorer. USGS Earth Explorer link: Table: overview of the Landsat satellite series.

Landsat 1-9 satellites

Landsat 1-4: Civil Earth-resources satellites were conceived in the mid-1960s by the USGS (United States Geological Survey), part of the DOI (Department of the Interior). On September 20, 1966, at a press conference in Washington, the USGS announced the earliest dedicated civil program for imaging Earth's surface from space: the Earth Resources Observation Satellites (EROS). The mission was assigned to NASA, which was to plan and build the satellites and their payloads. The ERTS project, initiated in 1966, was renamed Landsat in 1975; the early ERTS-1 and ERTS-2 satellites were therefore later renamed Landsat-1 and Landsat-2.

Landsat 1: Launched on July 23, 1972, carrying two sensors, the RBV and the MSS, Landsat 1 operated until January 1978, exceeding its design life by five years. The quality and impact of the resulting information surpassed all expectations.

Landsat 2: Originally designated ERTS-B, Landsat 2 was launched on January 22, 1975.
SRTM Data
SRTM data are among the elevation datasets most commonly used in the field of geographic information systems (GIS). SRTM (Shuttle Radar Topography Mission) was a mission conducted jointly by NASA (the U.S. National Aeronautics and Space Administration) and the NGA (the U.S. National Geospatial-Intelligence Agency) to acquire surface elevation data on a global scale. The mission was flown in February 2000, using radar instruments carried aboard the Space Shuttle to survey Earth's surface. SRTM data are commonly distributed in GeoTIFF (georeferenced Tagged Image File Format), a widely used raster format that can carry elevation values along with other georeferencing data. GeoTIFF data can be used and processed in a variety of GIS packages, such as ArcGIS and QGIS.
SRTM data can be characterized as follows:

1. Coverage: SRTM data span the globe, including land and ocean, with the focus on land surface elevation.
2. Spatial resolution: typically 90 m, meaning each pixel represents a 90 m × 90 m area of the surface.
3. Elevation accuracy: typically 16 m to 20 m, depending on terrain complexity and the processing method.
4. Data format: stored as GeoTIFF, containing elevation values plus georeferencing information such as the coordinate system and projection.
5. Data access: available through several channels, including official websites, GIS data vendors, and academic institutions; the specific route can be chosen according to the user's needs and access rights.
6. Data processing: SRTM data support a range of GIS analyses, such as terrain profiles, slope and aspect computation, and watershed analysis; they can also be overlaid with other geographic datasets, such as land-use or hydrological data.
7. Applications: SRTM data are used widely in terrain modeling, natural-resource management, environmental assessment, urban planning, water-resource management, and more. Analyzing SRTM data yields surface elevation information that helps decision makers formulate sound planning and management strategies.
In summary, SRTM data are a widely used elevation dataset, stored in GeoTIFF format, with global coverage, 90 m spatial resolution, and 16 m to 20 m elevation accuracy.
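As an illustration of the slope computation mentioned in item 6, slope can be estimated from an elevation grid by finite differences over the 3 × 3 neighborhood of a cell (Horn's method, the scheme commonly used in GIS packages). A minimal sketch in plain Python; the grid values and cell size are hypothetical:

```python
import math

def slope_deg(grid, row, col, cell=90.0):
    """Slope in degrees at an interior cell of an elevation grid (Horn's method).

    grid is a list of rows of elevations in meters; cell is the ground
    sample distance in meters (90 m for SRTM3).
    """
    # 3x3 neighborhood: a b c / d e f / g h i
    a, b, c = grid[row-1][col-1], grid[row-1][col], grid[row-1][col+1]
    d, _, f = grid[row][col-1],   grid[row][col],   grid[row][col+1]
    g, h, i = grid[row+1][col-1], grid[row+1][col], grid[row+1][col+1]
    dzdx = ((c + 2*f + i) - (a + 2*d + g)) / (8 * cell)
    dzdy = ((g + 2*h + i) - (a + 2*b + c)) / (8 * cell)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))
```

On a plane that rises 90 m per 90 m cell in one direction the result is 45°, which is a handy sanity check before applying the formula to real tiles.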
Introduction to SRTM, ASTER GDEM, and GTOPO30
SRTM (Shuttle Radar Topography Mission) was a spaceflight mission carried out jointly by NASA, the NGA, and DLR to create a digital elevation model (DEM) of global extent. SRTM mapped surface heights using synthetic aperture radar (SAR) flown on the spacecraft. The mission was completed in 2000, covered roughly 80% of Earth's land area, and provides elevation data at 30 m resolution. SRTM DEMs are widely used in terrain analysis, hydrological modeling, environmental research, geological exploration, and related fields.

ASTER GDEM (Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model) is a digital elevation model developed by JAXA (the Japan Aerospace Exploration Agency). It is generated from remote-sensing data of Earth's surface acquired by the ASTER instrument aboard the Terra satellite. ASTER GDEM covers about 99% of Earth's land surface and provides elevation data at 30 m resolution. It is widely used in geological, geographical, meteorological, and environmental research, in applications such as water-resource management, land-use planning, and natural-hazard studies.
These digital elevation models play an important role in geographic information system (GIS) and remote-sensing applications. They provide a valuable tool for earth scientists, environmental specialists, urban planners, and other professionals engaged in geospatial analysis. Beyond surface elevations, they can be used to generate derived products such as sink-filled DEMs, slope, and aspect. They are applied widely in geological research, hydrological modeling, land-use planning, natural-resource management, and environmental and climate-change studies.

These models do have limitations, however. SRTM and ASTER GDEM are derived from remote-sensing data and are therefore affected by cloud cover, sensor limitations, and shadows cast by surface cover. GTOPO30 has a lower resolution and may miss finer terrain variation. When using these digital elevation models, users should therefore weigh their limitations carefully and select and process the data appropriately for the application at hand.
Aerospace Material Specifications Converted from U.S. Military Standards

List of Aerospace Material Specifications (AMS) Converted from U.S. Military Standards
Since the United States issued its policy memorandum on military-specification reform in June 1994, 151 Aerospace Material Specifications (AMS) in the field of military materials and thermal processes had been converted from U.S. military standards as of January 1999. The conversion replaces the original designation prefix with AMS; that is, prefixes such as MS, AND, FED, and MIL become AMS, while the rest of the original designation is unchanged. The converted AMS standards are listed in Table 1 for reference.

Table 1. AMS specifications converted from U.S. military standards

Note: AND, Air Force-Navy Aeronautical Design Standards (DOD); AS, Aerospace Standards (SAE); FED, Federal Standards; MIL, Military Standards (DOD); MS, Military Standard Drawings; QQ, Federal Specifications.
NASA Technical Memorandum 88946
USAAVSCOM Technical Report 86-C-31
Measurements of the Unsteady Flow Field Within the Stator Row of a Transonic Axial-Flow Fan
II - Results and Discussion
M.D. Hathaway, Propulsion Directorate, U.S. Army Aviation Research and Technology Activity - AVSCOM, Lewis Research Center, Cleveland, Ohio
K.L. Suder, Lewis Research Center, Cleveland, Ohio
T.H. Okiishi, Iowa State University, Ames, Iowa
A.J. Strazisar and J.J. Adamczyk, Lewis Research Center, Cleveland, Ohio
Prepared for the 32nd International Gas Turbine Conference and Exhibition sponsored by the American Society of Mechanical Engineers
Anaheim, California, May 31-June 4, 1987
ABSTRACT
MEASUREMENTS OF THE UNSTEADY FLOW FIELD WITHIN THE STATOR ROW OF A TRANSONIC AXIAL-FLOW FAN
II - Results and Discussion
M.D. Hathaway, Propulsion Directorate, U.S. Army Aviation Research and Technology Activity - AVSCOM, Lewis Research Center, Cleveland, Ohio 44135
K.L. Suder, National Aeronautics and Space Administration, Lewis Research Center, Cleveland, Ohio 44135
T.H. Okiishi, Iowa State University, Ames, Iowa 50010
A.J. Strazisar and J.J. Adamczyk, National Aeronautics and Space Administration, Lewis Research Center, Cleveland, Ohio 44135
Results of detailed unsteady velocity field measurements made within the stator row of a transonic axial-flow fan are presented. Measurements were obtained at midspan for two different stator blade rows using a laser anemometer. The first stator row consists of double-circular-arc airfoils with a solidity of 1.68. The second stator row features controlled-diffusion airfoils with a solidity of 0.85. Both stator configurations were tested at design-speed peak efficiency conditions. In addition, the controlled-diffusion stator was also tested at near-stall design-speed conditions. The procedures developed in Part I of this paper are used to identify the "rotor-wake-generated" and "unresolved" unsteadiness from the velocity measurements. (The term "rotor-wake-generated" unsteadiness refers to the unsteadiness generated by the rotor wake velocity deficit, and the term "unresolved" unsteadiness refers to all of the remaining unsteadiness which contributes to the spread in the distribution of velocities, such as vortex shedding, turbulence, etc.) Auto and cross correlations of these unsteady velocity fluctuations are presented in order to indicate their relative magnitude and spatial distributions. Amplification and attenuation of both rotor-wake-generated and unresolved unsteadiness are shown to occur within the stator blade passage.
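The separation the abstract describes can be illustrated numerically: ensemble averaging repeated samples taken at a fixed rotor phase recovers the rotor-periodic part of the signal, and the residual spread about each ensemble average is the unresolved unsteadiness. The following sketch uses a synthetic record, not the paper's data; the signal shape, sample counts, and noise level are hypothetical:

```python
import math
import random

def decompose(samples_per_phase):
    """Split repeated per-phase velocity samples into mean, ensemble
    averages, and the RMS of the residual (unresolved) fluctuations.

    samples_per_phase[k] holds repeated samples at rotor phase k.
    """
    ens = [sum(s) / len(s) for s in samples_per_phase]        # periodic part
    vbar = sum(ens) / len(ens)                                # time average
    rms = [math.sqrt(sum((x - m) ** 2 for x in s) / len(s))   # residual spread
           for s, m in zip(samples_per_phase, ens)]
    return vbar, ens, rms

# Synthetic record: 100 m/s mean, a rotor-periodic wake deficit centered at
# phase 25, plus Gaussian noise standing in for unresolved unsteadiness.
random.seed(0)
phases = 50
data = [[100.0 - 10.0 * math.exp(-((k - 25) / 5.0) ** 2) + random.gauss(0, 1.0)
         for _ in range(200)] for k in range(phases)]
vbar, ens, rms = decompose(data)
```

The ensemble averages trace out the imposed wake deficit while the residual RMS stays near the imposed noise level, mirroring the way the measured velocity distributions are split into rotor-wake-generated and unresolved parts.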
NOMENCLATURE
Ns   number of stator blades
R    radius, cm
S    distance measured along stator exit mean camber line, cm
V    velocity magnitude, m/s
VA   average of stator inlet and exit velocities, m/s
VFS  freestream velocity, m/s
VT   total absolute velocity = sqrt(Vz^2 + Vθ^2), m/s
Γ    stator blade circulation, m^2/s

Subscripts:
I    stator row inlet condition
E    stator row exit condition
z    axial component
θ    tangential component
1    measured component in direction of first beam orientation angle
2    measured component in direction of second beam orientation angle

Superscripts:
'    fluctuating component
~    ensemble average
-    steady-state condition, or time average
AX   axisymmetric component

INTRODUCTION

In the past, unsteadiness in turbomachines has been generally categorized as being either "periodic" or "random" ("turbulent"). Flow-field fluctuations resulting from the relative motion between blade rows have been categorized as "periodic" unsteadiness. "Random" unsteadiness has been used as a catch-all term which includes flow-field fluctuations due to turbulence, vortex shedding, global flow-field fluctuations, true random unsteadiness, and any other unsteadiness not correlated with rotor speed. Therefore, in the absence of a more descriptive terminology for unsteady