Size and boundary effects in discrete dislocation dynamics: Coupling with continuum finite element

Hasan Yasin, Hussein M. Zbib, Moe A. Khaleel

Research output: Contribution to journal › Article › peer-review

62 Scopus citations

Abstract

In this work we develop a framework coupling continuum elasto-viscoplasticity with three-dimensional discrete dislocation dynamics (micro3d). The main task is to carry out rigorous analyses simulating the deformation of single-crystal metals (fcc and bcc) in finite domains. While the overall macroscopic response of the crystal is based on continuum theory, the constitutive response is determined by discrete dislocation dynamics analyses using micro3d. Size effects are investigated by considering two boundary value problems: (1) uniaxial loading of a single-crystal cube, and (2) bending of a crystal micro-beam. It is shown that the boundary conditions and the size of the computational cell have a significant effect on the results, owing to image stresses from free boundaries. The investigation shows that surface effects cannot be ignored regardless of cell size and may lead to errors of as much as 10%. Preliminary results pertaining to dislocation structures under these conditions are also given. Published by Elsevier Science B.V.
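The hierarchical coupling the abstract describes can be illustrated compactly: a continuum update advances the macroscopic stress, while a discrete dislocation dynamics (DDD) cell supplies the plastic, constitutive part of the response at each increment. The following is a minimal Python sketch of such a loop; the function name ddd_plastic_strain_rate, the overstress flow law, and all numerical values are hypothetical assumptions for illustration only, not the paper's method or the micro3d code.

# Illustrative one-dimensional sketch of a continuum/DDD coupling loop.
# All names and numbers are hypothetical; they are not from the paper.

E = 120e9                 # elastic modulus (Pa), illustrative value
applied_rate = 1e-3       # imposed total strain rate (1/s)
dt = 1e-6                 # time step (s)

def ddd_plastic_strain_rate(stress):
    """Stand-in for the DDD cell. A real micro3d step would evolve
    discrete dislocation segments and include image stresses from
    free boundaries, which is the source of the size and surface
    effects studied in the paper."""
    tau_c = 50e6          # illustrative threshold stress (Pa)
    return 1e4 * max(stress - tau_c, 0.0) / E   # simple overstress law

stress = strain = 0.0
for _ in range(1_000_000):
    # Elastic update of the macroscopic stress, relaxed by the plastic
    # strain rate returned from the dislocation-dynamics cell.
    stress += E * (applied_rate - ddd_plastic_strain_rate(stress)) * dt
    strain += applied_rate * dt
print(f"strain = {strain:.4f}, stress = {stress / 1e6:.1f} MPa")

In this toy version the stress saturates just above the threshold once the plastic rate balances the applied rate, mimicking viscoplastic flow; in the actual framework the DDD cell plays this role for the full three-dimensional fields.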

Original language: English
Pages (from-to): 294-299
Number of pages: 6
Journal: Materials Science and Engineering: A
Volume: 309-310
State: Published - Jul 15 2001
Externally published: Yes

Funding

The support of the National Science Foundation under grant number CMS-9634726 and the partial support of the Pacific Northwest National Laboratory are gratefully acknowledged.

Funders                                  Funder number
National Science Foundation              CMS-9634726
Pacific Northwest National Laboratory

Keywords

• Discrete dislocation dynamics
• Elasto-viscoplasticity
• Finite element analysis
